# [Official] NVIDIA RTX 3080 Owner's Club



## zhrooms

_Last Updated: November 13, 2020_

*Note: This content is licensed under a Creative Commons 3.0 (BY-NC-ND) license. This means that you are free to copy and redistribute this material, but only if the following criteria are met: 1) you must give appropriate credit by linking back to this thread; 2) you may not use this material for commercial purposes or place it on a for-profit website with ads; 3) you may not create derivative works based on this material.*

NVIDIA GeForce® RTX 3080

RTX 3070 Owner's Club
→ RTX 3080 Owner's Club
RTX 3090 Owner's Club

Click here to join the discussion on Discord or join directly through the Discord app with the code kkuFR3d









Source: NVIDIA

SPECS (Click Spoiler)



Spoiler






Rich (BB code):


 
   Architecture Ampere
   Chip GA102-200-A1
   Transistors 28,300 million
   Die Size 628 mm²
   Manufacturing Process Samsung 8nm

   CUDA Cores 4352 (8704)
   TMUs 272
   ROPs 96
   SM Count 68
   Tensor Cores 272
   GigaRays -- GR/s

   Core Clock 1440 MHz
   Boost Clock 1710 MHz
   Memory 10GB GDDR6X
   Memory Bus 320-bit
   Memory Clock 2376 MHz / 19000 MHz
   Memory Bandwidth 760 GB/s
   External Power Supply 12-Pin
   TDP 320W

   DirectX 12 Ultimate (Feature Level 12_2)
   OpenGL 4.6
   OpenCL 2.0
   Vulkan 1.2
   CUDA Compute Capability 8.6

   Interface PCIe 4.0 x16
   Connectors 1x HDMI 2.1, 3x DisplayPort 1.4a
   Dimensions 285 x 112mm (2-Slot)

   Price $699 US

   Release Date September 17, 2020







Rich (BB code):


RTX 3090    | GA102-300 |  8nm | 628mm² | 28.3 BT | 5248 CCs* | 328 TMUs | 112 ROPs | 82 SMs | 1695 MHz |  24GB | 1024MB x 24 | GDDR6X | 384-bit | 936 GB/s | 350W
RTX 3080    | GA102-200 |  8nm | 628mm² | 28.3 BT | 4352 CCs* | 272 TMUs |  96 ROPs | 68 SMs | 1710 MHz |  10GB | 1024MB x 10 | GDDR6X | 320-bit | 760 GB/s | 320W
RTX 2080 Ti | TU102-300 | 12nm | 754mm² | 18.6 BT | 4352 CCs  | 272 TMUs |  88 ROPs | 68 SMs | 1635 MHz |  11GB | 1024MB x 11 | GDDR6  | 352-bit | 616 GB/s | 250W
RTX 2080 S  | TU104-450 | 12nm | 545mm² | 13.6 BT | 3072 CCs  | 192 TMUs |  64 ROPs | 48 SMs | 1815 MHz |   8GB | 1024MB x 8  | GDDR6  | 256-bit | 496 GB/s | 250W
RTX 2080    | TU104-400 | 12nm | 545mm² | 13.6 BT | 2944 CCs  | 184 TMUs |  64 ROPs | 46 SMs | 1710 MHz |   8GB | 1024MB x 8  | GDDR6  | 256-bit | 448 GB/s | 215W
GTX 1080 Ti | GP102-350 | 16nm | 471mm² | 12.0 BT | 3584 CCs  | 224 TMUs |  88 ROPs | 28 SMs | 1582 MHz |  11GB | 1024MB x 11 | GDDR5X | 352-bit | 484 GB/s | 250W
GTX 1080    | GP104-400 | 16nm | 314mm² |  7.2 BT | 2560 CCs  | 160 TMUs |  64 ROPs | 20 SMs | 1733 MHz |   8GB | 1024MB x 8  | GDDR5X | 256-bit | 320 GB/s | 180W
GTX 980 Ti  | GM200-310 | 28nm | 601mm² |  8.0 BT | 2816 CCs  | 176 TMUs |  96 ROPs | 22 SMs | 1076 MHz |   6GB |  512MB x 12 | GDDR5  | 384-bit | 336 GB/s | 250W
GTX 980     | GM204-400 | 28nm | 398mm² |  5.2 BT | 2048 CCs  | 128 TMUs |  64 ROPs | 16 SMs | 1216 MHz |   4GB |  512MB x 8  | GDDR5  | 256-bit | 224 GB/s | 165W
GTX 780 Ti  | GK110-425 | 28nm | 551mm² |  7.1 BT | 2880 CCs  | 240 TMUs |  48 ROPs | 15 SMs |  928 MHz |   3GB |  256MB x 12 | GDDR5  | 384-bit | 336 GB/s | 250W
GTX 780     | GK110-300 | 28nm | 551mm² |  7.1 BT | 2304 CCs  | 192 TMUs |  48 ROPs | 12 SMs |  900 MHz |   3GB |  256MB x 12 | GDDR5  | 384-bit | 288 GB/s | 250W
GTX 680     | GK104-400 | 28nm | 294mm² |  3.5 BT | 1536 CCs  | 128 TMUs |  32 ROPs |  8 SMs | 1058 MHz |   2GB |  256MB x 8  | GDDR5  | 256-bit | 192 GB/s | 200W
GTX 580     | GF110-375 | 40nm | 520mm² |  3.0 BT |  512 CCs  |  64 TMUs |  48 ROPs | 16 SMs |  772 MHz | 1.5GB |  128MB x 12 | GDDR5  | 384-bit | 192 GB/s | 250W
   * Can execute twice as many FP32 calculations per clock as the previous generation when executing only FP32 operations,
     thus marketed as 10496 and 8704 CUDA cores
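As a sanity check on the bandwidth column above, peak memory bandwidth is simply the bus width in bytes multiplied by the effective per-pin data rate (19 Gbps for the 3080's GDDR6X). A minimal sketch in Python (the function name is illustrative):

```python
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth: bytes moved per transfer across the bus * data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Figures from the table above:
print(bandwidth_gb_s(320, 19))    # RTX 3080, GDDR6X @ 19 Gbps    -> 760.0 GB/s
print(bandwidth_gb_s(384, 19.5))  # RTX 3090, GDDR6X @ 19.5 Gbps  -> 936.0 GB/s
print(bandwidth_gb_s(352, 14))    # RTX 2080 Ti, GDDR6 @ 14 Gbps  -> 616.0 GB/s
```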

The RTX 3080 and RTX 3090 share the same PCB design, except that the RTX 3080 lacks NVLink and has no memory modules on the back of the board. The NVIDIA Reference PCB (2x8-pin) has room for 9 power stages on the left and 11 on the right, with 4 of those allocated to memory, so the 3080/3090 reference PCB supports a maximum of 16 power stages for the GPU. Since the RTX 3080 is a cut-down version of the RTX 3090 with lower power demands, there is no need to populate every position with power stages, capacitors, and inductors; instead, NVIDIA partners decide how many to use on cards based on the reference PCB. NVIDIA's Founders Edition features a custom PCB this time around, with 18 power stages in total: 15 for the GPU and 3 for the memory. For comparison, no partner card using the NVIDIA Reference PCB (2x8-pin) seen so far goes below 13 power stages for the GPU and 3 for the memory, which happens to be the exact VRM allocation of the 2080 Ti Reference PCB.
ASUS
AsusTek Computer (stylised as ASUS) was founded in Taipei, Taiwan in 1989 and is currently headquartered in Taipei, Taiwan.


MSRP+$ | $/Watt | Model       | Length | Slot | Fan   | HDMI | BIOS | Power   | Stages | Power Limit | Boost    | PCB       | PWM     | EAN           | PN
$300   | 2.732  | EKWB        | 216mm  | 2    | Water | 1    | 1    | 2x8-Pin | 16     | 320/366 W   | 1710 MHz | Reference |         | 4718017991438 | 90YV0F60-M0NM00
       |        | TUF         | 300mm  | 2.6  | 3     | 2    | 2    | 2x8-Pin | 20     | 320/375 W   | 1710 MHz | Custom    | uP9512R | 4718017909327 | 90YV0FB0-M0NM00
$160   | 2.293  | TUF OC      | 300mm  | 2.6  | 3     | 2    | 2    | 2x8-Pin | 20     | 340/375 W   | 1785 MHz | Custom    | uP9512R | 4718017922760 | 90YV0FB1-M0NM00
       |        | Strix       | 319mm  | 2.8  | 3     | 2    | 2    | 3x8-Pin | 22     | 320/450 W   | 1710 MHz | Custom    | MP2888A | 4718017909150 | 90YV0FA0-M0NM00
       |        | Strix LE    | 319mm  | 2.8  | 3     | 2    | 2    | 3x8-Pin | 22     | 320/450 W   | 1710 MHz | Custom    | MP2888A | 4711081002802 | 90YV0FA6-M0NM00
$230   | 2.066  | Strix OC    | 319mm  | 2.8  | 3     | 2    | 2    | 3x8-Pin | 22     | 370/450 W   | 1905 MHz | Custom    | MP2888A | 4718017928786 | 90YV0FA1-M0NM00
$350   | 2.333  | Strix OC LE | 319mm  | 2.8  | 3     | 2    | 2    | 3x8-Pin | 22     | 370/450 W   | 1905 MHz | Custom    | MP2888A | 4711081002925 | 90YV0FA5-M0NM00
 
EVGA
EVGA Corporation was founded in California, United States in 1999 and is currently headquartered in California, United States.


MSRP+$ | $/Watt | Model      | Length | Slot | Fan    | HDMI | BIOS | Power   | Stages | Power Limit | Boost    | PCB    | PWM     | EAN           | PN
$100   | 2.352  | XC3 Black  | 285mm  | 2.2  | 3      | 1    | 1    | 2x8-Pin | 16     | 320/340 W   | 1710 MHz | Custom | uP9511R | 4250812438355 | 10G-P5-3881-KR
$120   | 2.240  | XC3        | 285mm  | 2.2  | 3      | 1    | 1    | 2x8-Pin | 16     | 340/366 W   | 1710 MHz | Custom | uP9511R | 4250812438317 | 10G-P5-3883-KR
$140   | 2.295  | XC3 Ultra  | 285mm  | 2.2  | 3      | 1    | 1    | 2x8-Pin | 16     | 340/366 W   | 1755 MHz | Custom | uP9511R | 4250812438300 | 10G-P5-3885-KR
$170   | 2.196  | XC3 Ultra  | 263mm  | 2    | Hybrid | 1    | 1    | 2x8-Pin | 16     | 340/366 W   | 1755 MHz | Custom | uP9511R | 4250812438270 | 10G-P5-3888-KR
$200   | 2.244  | XC3 Ultra  | 263mm  | 1.5  | Water  | 1    | 1    | 2x8-Pin | 16     | 340/366 W   | 1755 MHz | Custom | uP9511R | 4250812438263 | 10G-P5-3889-KR
$160   | 1.911  | FTW3       | 300mm  | 2.75 | 3      | 1    | 2    | 3x8-Pin | 22     | 380/450 W   | 1755 MHz | Custom | uP9511R | 4250812438294 | 10G-P5-3895-KR
$180   | 1.955  | FTW3 Ultra | 300mm  | 2.75 | 3      | 1    | 2    | 3x8-Pin | 22     | 380/450 W   | 1800 MHz | Custom | uP9511R | 4250812438287 | 10G-P5-3897-KR
$200   | 1.875  | FTW3 Ultra | 289mm  | 2    | Hybrid | 1    | 2    | 3x8-Pin | 22     | 380/450 W   | 1800 MHz | Custom | uP9511R | 4250812438256 | 10G-P5-3898-KR
$230   | 1.917  | FTW3 Ultra | 289mm  | 1.5  | Water  | 1    | 2    | 3x8-Pin | 22     | 380/450 W   | 1800 MHz | Custom | uP9511R | 4250812438249 | 10G-P5-3899-KR

GALAX | KFA2 - Not available in North America
GALAXY was founded in Hong Kong, China in 1994. GALAXY and its European brand KFA2 (Kick Friggin Ass) merged in 2014 to form GALAX as a single unified brand; the KFA2 name still exists for the European market, but all designs are GALAX. The company is currently headquartered in Hong Kong, China.


Model | Length | Slot | Fan | HDMI | BIOS | Power   | Stages | Power Limit | Boost    | PCB       | PWM     | EAN           | PN
SG    | 317mm  | 3    | 3   | 1    | 1    | 2x8-Pin | 16     | 320/320 W   | 1710 MHz | Reference | uP9511R | 4895147141060 | 38NWM3MD99NN
SG    | 317mm  | 3    | 3   | 1    | 1    | 2x8-Pin | 16     | 320/320 W   | 1710 MHz | Reference | uP9511R | 4895147141183 | 38NWM3MD99NK

GIGABYTE
GIGA-BYTE Technology (stylised as GIGABYTE) was founded in Taipei, Taiwan in 1986 and is currently headquartered in Taipei, Taiwan and California, United States.


MSRP+$ | $/Watt | Model            | Length | Slot | Fan    | HDMI | BIOS | Power   | Stages | Power Limit | Boost    | PCB    | EAN           | PN
$699   | 2.184  | Eagle            | 320mm  | 2.8  | 3      | 2    | 1    | 2x8-Pin | 17     | 320/320 W   | 1710 MHz | Custom | 4719331307851 | GV-N3080EAGLE-10GD
$729   | 2.144  | Eagle OC         | 320mm  | 2.8  | 3      | 2    | 1    | 2x8-Pin | 17     | 340/340 W   | 1755 MHz | Custom | 4719331307554 | GV-N3080EAGLE OC-10GD
$769   | 2.078  | Vision OC        | 320mm  | 2.75 | 3      | 2    | 1    | 2x8-Pin | 17     | 370/370 W   | 1800 MHz | Custom | 4719331307561 | GV-N3080VISION OC-10GD
$749   | 2.024  | Gaming OC        | 320mm  | 2.75 | 3      | 2    | 2    | 2x8-Pin | 17     | 370/370 W   | 1800 MHz | Custom | 4719331307530 | GV-N3080GAMING OC-10GD
$849   | 2.295  | AORUS Master     | 319mm  | 3.5  | 3      | 3    | 2    | 2x8-Pin | 20     | 370/370 W   | 1845 MHz | Custom | 4719331307639 | GV-N3080AORUS M-10GD
       |        | AORUS Waterforce | 252mm  | 2    | Hybrid | 3    | 2    | 2x8-Pin | 20     | 370/370 W   | 1845 MHz | Custom |               | GV-N3080AORUSX W-10GD
       |        | AORUS Waterforce | 252mm  | 2    | Water  | 3    | 2    | 2x8-Pin | 20     | 370/370 W   | 1845 MHz | Custom |               | GV-N3080AORUSX WB-10GD
$899   | 1.998  | AORUS Xtreme     | 319mm  | 3.5  | 3      | 3    | 2    | 3x8-Pin |        | 370/450 W   | 1905 MHz | Custom | 4719331307783 | GV-N3080AORUS X-10GD

INNO3D - Not available in North America
InnoVISION Multimedia was founded in Hong Kong, China in 1989 and is primarily recognized for the graphics cards it markets under the Inno3D brand. It was acquired by PC Partner in 2008 and is currently headquartered in Hong Kong, China.


Model            | Length | Slot | Fan   | HDMI | BIOS | Power   | Stages | Power Limit | Boost    | PCB       | EAN           | PN
Twin X2 OC       | 274mm  | 3    | 2     | 1    | 1    | 2x8-Pin | 16     | 320/320 W   | 1725 MHz | Reference | 0835168001589 | N30802-106XX-1810VA34
iChill X3        | 300mm  | 3    | 3     | 1    | 1    | 2x8-Pin | 16     | 340/340 W   | 1770 MHz | Reference | 0835168001572 | C30803-106XX-1810VA37
iChill X4        | 300mm  | 3    | 3     | 1    | 1    | 2x8-Pin | 16     | 340/340 W   | 1770 MHz | Reference | 0835168001565 | C30804-106XX-1810VA36
iChill Frostbite | 226mm  | 2    | Water | 1    | 1    | 2x8-Pin | 16     | 340/340 W   | 1770 MHz | Reference | 0835168001640 | C30803-106XX-1810VA37

MSI
Micro-Star International (stylised as MSI) was founded in Taipei, Taiwan in 1986 and is currently headquartered in Taipei, Taiwan.


MSRP+$ | $/Watt | Model         | Length | Slot | Fan | HDMI | BIOS | Power   | Stages | Power Limit | Boost    | PCB    | EAN           | PN
$699   | 2.184  | Ventus        | 305mm  | 2.85 | 3   | 1    | 1    | 2x8-Pin | 16     | 320/320 W   | 1710 MHz | Custom |               |
$739   | 2.309  | Ventus OC     | 305mm  | 2.85 | 3   | 1    | 1    | 2x8-Pin | 16     | 320/320 W   | 1740 MHz | Custom | 4719072762520 | V389-001R
       |        | Gaming Trio   | 323mm  | 2.8  | 3   | 1    | 1    | 3x8-Pin | 16     | 340/350 W   | 1755 MHz | Custom |               |
$759   | 2.169  | Gaming X Trio | 323mm  | 2.8  | 3   | 1    | 1    | 3x8-Pin | 16     | 340/350 W   | 1815 MHz | Custom | 4719072762544 | V389-005R
       |        | Suprim        |        |      | 3   | 1    | 2    | 3x8-Pin |        |             |          | Custom |               |
       |        | Suprim X      |        |      | 3   | 1    | 2    | 3x8-Pin |        |             |          | Custom |               |

NVIDIA
Nvidia Corporation (stylised as nVIDIA) was founded in California, United States in 1993 and is currently headquartered in California, United States.


MSRP+$ | $/Watt | Model            | Length | Slot | Fan | HDMI | BIOS | Power    | Stages | Power Limit | Boost    | PCB    | EAN          | PN
$699   | 1.889  | Founders Edition | 285mm  | 2    | 2   | 1    | 1    | 1x12-Pin | 18     | 320/370 W   | 1710 MHz | Custom | 812674024509 | 900-1G133-2530-000

PALIT | GAINWARD - Not available in North America
Palit Microsystems (stylised as PaLiT) was founded in Taipei, Taiwan in 1988, acquired the Gainward brand and company in 2005, and is currently headquartered in Taipei, Taiwan.


Model        | Length | Slot | Fan | HDMI | BIOS | Power   | Stages | Power Limit | Boost    | PCB       | EAN           | PN
Phoenix      | 294mm  | 2.7  | 3   | 1    | 1    | 2x8-Pin | 17     | 320/350 W   | 1710 MHz | Reference | 4710562241952 | NED3080019IA-132AX
GamingPro    | 294mm  | 2.7  | 3   | 1    | 1    | 2x8-Pin | 17     | 320/350 W   | 1710 MHz | Reference | 4710562241945 | NED3080019IA-132AA
Phoenix GS   | 294mm  | 2.7  | 3   | 1    | 1    | 2x8-Pin | 17     | 320/350 W   | 1740 MHz | Reference | 4710562242010 | NED3080S19IA-132AX
GamingPro OC | 294mm  | 2.7  | 3   | 1    | 1    | 2x8-Pin | 17     | 320/350 W   | 1740 MHz | Reference | 4710562242003 | NED3080S19IA-132AA
Phantom      | 304mm  | 2.7  | 3   | 1    | 2    | 3x8-Pin |        | 340/400 W   | 1755 MHz | Custom    | 4710562242119 | NED3080U19IA-1020P
GameRock     | 304mm  | 3    | 3   | 1    | 2    | 3x8-Pin |        | 340/400 W   | 1740 MHz | Custom    | 4710562242102 | NED3080U19IA-1020G
Phantom GS   | 304mm  | 2.7  | 3   | 1    | 2    | 3x8-Pin |        | 370/440 W   | 1860 MHz | Custom    | 4710562242140 | NED3080H19IA-1020P
GameRock OC  | 304mm  | 3    | 3   | 1    | 2    | 3x8-Pin |        | 370/440 W   | 1845 MHz | Custom    | 4710562242133 | NED3080H19IA-1020G

PNY
PNY Technologies was founded in New York, United States in 1985 and is currently headquartered in New Jersey, United States.


MSRP+$ | $/Watt | Model         | Length | Slot | Fan | HDMI | BIOS | Power   | Stages | Power Limit | Boost    | PCB       | PWM      | EAN           | PN
$50    | 2.343  | XLR8 Uprising | 294mm  | 2.7  | 3   | 1    | 1    | 2x8-Pin | 17     | 320/320 W   | 1710 MHz | Reference | NCP81610 | 0751492639550 | VCG308010TFXMPB
$50    | 2.142  | XLR8 Revel    | 294mm  | 2.7  | 3   | 1    | 1    | 2x8-Pin | 17     | 320/350 W   | 1710 MHz | Reference | NCP81610 | 0751492639567 | VCG308010TFXPPB

ZOTAC
ZOTAC, under the umbrella of PC Partner, was founded in Hong Kong, China in 2006 and is currently headquartered in Hong Kong, China.


MSRP+$ | $/Watt | Model            | Length | Slot | Fan | HDMI | BIOS | Power   | Stages | Power Limit | Boost    | PCB    | PWM     | EAN           | PN
$20    | 2.142  | Trinity          | 318mm  | 2.9  | 3   | 1    | 1    | 2x8-Pin | 16     | 320/336 W   | 1710 MHz | Custom | uP9511R | 4895173622403 | ZT-A30800D-10P
$30    | 2.172  | Trinity OC       | 318mm  | 2.9  | 3   | 1    | 1    | 2x8-Pin | 16     | 320/336 W   | 1725 MHz | Custom | uP9511R | 4895173622496 | ZT-A30800J-10P
       |        | Trinity OC White | 318mm  | 2.9  | 3   | 1    | 1    | 2x8-Pin | 16     | 320/336 W   | 1740 MHz | Custom | uP9511R | 4895173622717 | ZT-A30800K-10P
       |        | AMP Holo         | 318mm  | 2.9  | 3   | 1    | 1    | 2x8-Pin | 19     | 340/374 W   | 1770 MHz | Custom | uP9511R | 4895173622465 | ZT-A30800F-10P

TECHPOWERUP | GPU-Z 

Download TechPowerUp GPU-Z


NVIDIA | NVFLASH 

Download NVIDIA NVFlash


BIOS | ROM 

TechPowerUp BIOS Collection < Verified 

TechPowerUp BIOS Collection < Unverified

FLASH | GUIDE (Click Spoiler)



Spoiler













└ Step 01 of 27 - Download NVFlash ┘ 










└ Step 02 of 27 - Downloads Folder ┘ 










└ Step 03 of 27 - Open Zip File ┘ 










└ Step 04 of 27 - Copy Files ┘ 










└ Step 05 of 27 - Create New Folder ┘ 










└ Step 06 of 27 - Name Folder ┘ 










└ Step 07 of 27 - Paste Files ┘ 










└ Step 08 of 27 - Installation Successful ┘ 










└ Step 09 of 27 - Find BIOS ┘ 










└ Step 10 of 27 - Download BIOS ┘ 










└ Step 11 of 27 - Name BIOS ┘ 










└ Step 12 of 27 - Copy or Cut BIOS ┘ 










└ Step 13 of 27 - Paste BIOS ┘ 










└ Step 14 of 27 - Download Successful ┘ 










└ Step 15 of 27 - Before Flash ┘ 










└ Step 16 of 27 - Maximum Power Limit (330W) ┘ 



















└ Step 17 of 27 - Starting Command Prompt (Administrator) ┘ 










chdir C:\nvflash

└ Step 18 of 27 - Changing Directory ┘ 










nvflash64 --protectoff

└ Step 19 of 27 - Disable Flash Protection ┘ 










nvflash64 --save Partner2080TiModel.rom

└ Step 20 of 27 - (Optional) BIOS Backup ┘ 










└ Step 21 of 27 - BIOS Saved ┘ 










nvflash64 -6 Partner2080TiModel.rom

Y

└ Step 22 of 27 - Flash BIOS ┘ 










Y

└ Step 23 of 27 - Confirm Update ┘ 










exit

└ Step 24 of 27 - Flash Completed ┘ 










└ Step 25 of 27 - After Flash ┘ 










└ Step 26 of 27 - Maximum Power Limit (380W) ┘ 










chdir C:\nvflash

nvflash64 --protecton

exit

└ Step 27 of 27 - (Optional) Enable Flash Protection ┘ 
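For quick reference, the command-line portion of the guide above (steps 18 through 27) condenses to the following sequence, run from an Administrator Command Prompt. The .rom filename is whatever you named the downloaded BIOS in step 11 (the guide uses Partner2080TiModel.rom as its example name); the backup filename in the --save step is your choice:

```
chdir C:\nvflash
nvflash64 --protectoff                     :: step 19, disable flash protection
nvflash64 --save Partner2080TiModel.rom    :: step 20, optional backup of the current BIOS
nvflash64 -6 Partner2080TiModel.rom        :: step 22, flash; confirm twice with Y
exit
:: after rebooting, optionally re-enable flash protection (step 27):
chdir C:\nvflash
nvflash64 --protecton
exit
```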



OVERCLOCKING | TOOLS 

Download ASUS GPUTweak II

Download EVGA Precision X1

Download Gainward EXPERTool

Download Galax/KFA2 Xtreme Tuner Plus

Download Gigabyte AORUS Engine

Download Inno3D TuneIT

Download MSI Afterburner

Download Palit ThunderMaster

Download PNY Velocity X

Download Zotac FireStorm


QUESTIONS | FAQ 

Last Updated: November 13, 2020​
Question: Price per Watt?
Answer: Lower is better! (Missing cards will be added as soon as their power limit and price are confirmed)

Values have been adjusted for Hybrid- and Water-cooled cards because of the effect that cooling the die has on effective power draw: as die temperature decreases, leakage current drops, so power draw under identical loads and clocks also decreases. This effectively raises the power limit by approximately 30 W for Hybrid and 35 W for Water cooling.
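A minimal sketch of how the price-per-watt values below are derived (the function name and cooling labels are illustrative; the +30 W / +35 W adjustments are the ones described above):

```python
def price_per_watt(price_usd: float, power_limit_w: float, cooling: str = "air") -> float:
    """Price divided by (effective) maximum power limit; lower is better."""
    # Hybrid and Water cooling effectively raise the usable power limit
    adjustment = {"air": 0, "hybrid": 30, "water": 35}[cooling]
    return round(price_usd / (power_limit_w + adjustment), 3)

print(price_per_watt(699, 370))  # NVIDIA Founders Edition -> 1.889
print(price_per_watt(789, 450))  # EVGA FTW3 3x8-Pin       -> 1.753
```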


Model                         | Power Limit / Above 320W | Price (MSRP) / Above FE | $/Watt
EVGA FTW3 3x8-Pin             | 450W / +130 (41%)        | $789 / +90 (13%)        | 1.753
EVGA FTW3 Ultra 3x8-Pin       | 450W / +130 (41%)        | $809 / +110 (16%)       | 1.797
ASUS TUF                      | 375W / +55 (17%)         | $699 / +0 (0%)          | 1.864
ASUS ROG Strix OC 3x8-Pin     | 450W / +130 (41%)        | $849 / +150 (21%)       | 1.886
NVIDIA Founders Edition       | 370W / +50 (16%)         | $699 / +0 (0%)          | 1.889
ASUS TUF OC                   | 375W / +55 (17%)         | $749 / +50 (7%)         | 1.997
Gigabyte AORUS Xtreme 3x8-Pin | 450W / +130 (41%)        | $899 / +200 (29%)       | 1.998
Gigabyte Gaming OC            | 370W / +50 (16%)         | $749 / +50 (7%)         | 2.024
EVGA XC3                      | 366W / +46 (14%)         | $749 / +50 (7%)         | 2.046
Gigabyte Vision OC            | 370W / +50 (16%)         | $769 / +70 (10%)        | 2.078
EVGA XC3 Ultra                | 366W / +46 (14%)         | $769 / +70 (10%)        | 2.101
Zotac Trinity                 | 336W / +16 (5%)          | $719 / +20 (3%)         | 2.139
PNY XLR8 Revel                | 350W / +30 (9%)          | $749 / +50 (7%)         | 2.140
Gigabyte Eagle OC             | 340W / +20 (6%)          | $729 / +30 (4%)         | 2.144
EVGA XC3 Black                | 340W / +20 (6%)          | $729 / +30 (4%)         | 2.144
MSI Gaming X Trio 3x8-Pin     | 350W / +30 (9%)          | $759 / +60 (9%)         | 2.169
Zotac Trinity OC              | 336W / +16 (5%)          | $729 / +30 (4%)         | 2.170
MSI Ventus                    | 320W / +0 (0%)           | $699 / +0 (0%)          | 2.184
Gigabyte Eagle                | 320W / +0 (0%)           | $699 / +0 (0%)          | 2.184
Gigabyte AORUS Master         | 370W / +50 (16%)         | $849 / +150 (21%)       | 2.295
MSI Ventus OC                 | 320W / +0 (0%)           | $739 / +40 (6%)         | 2.309
PNY XLR8 Uprising             | 320W / +0 (0%)           | $749 / +50 (7%)         | 2.340


----------



## zhrooms

Reserved


----------



## Mooncheese

Nvidia GeForce RTX 3080 Early Look: Ampere Architecture Performance - Hands-On!






2080 Ti here. This is enticing, but there's a huge price gap between the 3080 and 3090, and NV essentially confirmed my suspicion that the 3090 is basically a renamed Titan, meaning a 3080 Ti or 3080 S with 20GB of VRAM being dropped in response to Big Navi is very likely. Seeing as I was late to upgrade from a 1080 Ti to a 2080 Ti (6 months ago, used, $900), I am probably going to wait for the 3080 Ti / 3080 S.


----------



## Jaju123

I'm all in, my dudes... 2080 to 3080, here I come. +80% performance? I'll take it.


----------



## sakete

Ha, I'm coming from 980ti. +300% performance?


----------



## sakete

EVGA: https://www.evga.com/articles/01434/evga-geforce-rtx-30-series/


----------



## muffins

i would preorder the 3080 FE right now if i could. very annoyed nvidia is not taking preorders -_-.

though, i hate to ask this, but do you think a 3600x will bottleneck the 3080? nvidia is touting these cards as pci-express 4, but all their testing was on an intel 10900k which only uses pci-express 3. makes me feel like pci-express 4 is not needed, or at least, didn't help ryzen come out on top to be used over intel for their testing.

i feel like i should sell off my 3600x for a 10600k or something.


----------



## Sonac

3080 ftw!


----------



## Chargeit

muffins said:


> i would preorder the 3080 FE right now if i could. very annoyed nvidia is not taking preorders -_-.
> 
> though, i hate to ask this, but do you think a 3600x will bottleneck the 3080? nvidia is touting these cards as pci-express 4, but all their testing was on an intel 10900k which only uses pci-express 3. makes me feel like pci-express 4 is not needed, or at least, didn't help ryzen come out on top to be used over intel for their testing.
> 
> i feel like i should sell off my 3600x for a 10600k or something.


I wouldn't until we see what kind of effect PCIe 3.0/4.0 has.


----------



## Killmassacre

I was hesitant to get a 3080 since it only has 10GB of VRAM, but I used my time machine to preorder one just in case (it sold out in 2 minutes!).


----------



## sakete

Killmassacre said:


> I was hesitant to get a 3080 since it only has 10GB of VRAM, but I used my time machine to preorder one just in case (it sold out in 2 minutes!).


Time machine?


----------



## jwsg

The size, price, and probably performance are all better for the 3080 than people expected, I think, whereas the price of the 3090 is now outside most people's reach.

So until there is a 3080/20 on the market, and that might not be until GDDR6X prices come down, it comes down to whether 10GB is enough, especially if you're on an 11GB 1080 Ti looking to upgrade.

Although adding more cores increases performance in any game, just adding more VRAM doesn't, and few games allocate up to the available size. Only a small percentage of people have 11 or more GB of VRAM and game developers know this, and this isn't going to change now that Nvidia is sticking with 10GB, so I think 10GB is still a safe option going forward.


----------



## sblantipodi

Isn't 10GB too little for a new card, considering that the 1080 Ti had 11GB and the 2080 Ti 11GB?


----------



## mouacyk

I'm rather curious whether the 320-bit SKU was a die-harvest issue, or whether they chose to limit VRAM to cut expensive G6X costs. Imagine buying 2x1GB Micron G6X chips yourself, soldering them on, and either getting a working 352-bit or full 384-bit card. A self-made 3085/3080 Ti. Hopes aside, still a curious question.


----------



## sblantipodi

mouacyk said:


> I'm rather curious whether the 320-bit SKU was a die-harvest issue, or whether they chose to limit VRAM to cut expensive G6X costs. Imagine buying 2x1GB Micron G6X chips yourself, soldering them on, and either getting a working 352-bit or full 384-bit card. A self-made 3085/3080 Ti. Hopes aside, still a curious question.


Jensen says that the 3080 is the flagship; a flagship that has less RAM than a 1080 Ti?
OK, I like the price, but there is a gap between the 3080 and 3090 that will be filled soon.

Imho we will see a 3080Ti soon after Christmas


----------



## sakete

Guess I'll be looking at the EVGA 3080 XC3 Ultra. That should be the reference PCB design based on previously released cards, and the easiest to get waterblocks for, such as Heatkiller. EVGA just confirmed that Hydro Copper will come later, no ETA. So I'm going with a third-party waterblock.


----------



## Chargeit

I'm leaning towards the FE card since I've never owned a reference card. How has people's experience with them been? This new design seems extremely interesting.

As far as the 3080 and vram goes, I've tested various games I play over the last few days and the most vram I've seen used was 6gb at 3440x1440. Feel like newer games might start flirting with 8gb+ though 10gb should last until the 40xx come out. Regardless, I plan on keeping my 1080 ti as a back up. I'll pull the 3080 about a month out from the 40xx release and flip it. Point being, I don't see the 3080 as being a longer term option for me and feel like it's more of a stopgap until next gen. I'm hoping by then AMD has shown its cards and hopefully brought competition back to the higher end.


----------



## Killmassacre

sakete said:


> Time machine?


Just making a pun at the fact that there is an owners club before the card is even available for preorder!

Although to be fair I will be a 3080 owner soon. I'm upgrading from a 2070 Super which is only getting me 20-30 fps in MSFS. One thing that bugs me though is that NVIDIA says the 3080 is twice as fast as the 2080, but they only added 2GB more VRAM. If it's twice as fast, shouldn't it have 16GB of VRAM? Although I'm not gaming in 8K or even 4K, just a measly 3440x1440, so I'll probably be ok... I hope.


----------



## Chargeit

Was able to push that New World game over 10gb vram. Maxed out over 10gb vram and over 16gb system ram usage but didn't screenshot it. Think there's some kind of memory leak. Though I went afk and when I came back it had cut down to 9.5gb vram and 12.5gb system ram usage.


----------



## Awsan

So is my power supply useless now?


----------



## Arni90

Awsan said:


> So is my power supply useless now?


Yes, but don't worry. You can send it to me and I can recycle it for you free of charge
/s

Your PSU is by no means useless, but it might be a bit small if you're planning on running a 3080 as well as a 3950X at full throttle at the same time.
Worst case scenario is that you trigger the over current protection while doing something really stressful, that will simply cause the computer to reboot with no damage done.


----------



## Mooncheese

Chargeit said:


> I'm leaning towards the FE card since I've never owned a reference card. How has people's experience with them been? This new design seems extremely interesting.
> 
> As far as the 3080 and vram goes, I've tested various games I play over the last few days and the most vram I've seen used was 6gb at 3440x1440. Feel like newer games might start flirting with 8gb+ though 10gb should last until the 40xx come out. Regardless, I plan on keeping my 1080 ti as a back up. I'll pull the 3080 about a month out from the 40xx release and flip it. Point being, I don't see the 3080 as being a longer term option for me and feel like it's more of a stopgap until next gen. I'm hoping by then AMD has shown its cards and hopefully brought competition back to the higher end.


I see 9GB in Shadow of War and The Division 2 @ 3440x1440 (by no means an exhaustive list, just the first two that come to mind), and 10.5GB in Half Life Alyx. 8 and 10GB are not future proof and GDDR6X isn't THAT expensive. 

The good news is that there is clearly a gaping chasm between the $700 3080 and the $1500 titan-class replacement 3090; NGreedia most definitely have a 20GB 3080S / 3080 Ti up their sleeve that they are saving for Big Navi. If you're similarly bothered with the VRAM debacle and have a 2080 Ti, just wait for that. That's what I'm doing.


----------



## Chargeit

Mooncheese said:


> I see 9GB in Shadow of War and The Division 2 @ 3440x1440 (by no means an exhaustive list, just the first two that come to mind), and 10.5GB in Half Life Alyx. 8 and 10GB are not future proof and GDDR6X isn't THAT expensive.
> 
> The good news is that there is clearly a gaping chasm between the $700 3080 and the $1500 titan-class replacement 3090; NGreedia most definitely have a 20GB 3080S / 3080 Ti up their sleeve that they are saving for Big Navi. If you're similarly bothered with the VRAM debacle and have a 2080 Ti, just wait for that. That's what I'm doing.


Yeah I was able to get that New World game over 10gb last night. Other games I tested might have gone higher if I played for an extended period. 

Still on a 1080 ti. Noticed in that New World game once the Vram started filling up the frame timing of the game started feeling poor. I've noticed in other games I play that after an extended play session (4+ hours) the frame timing can feel off. Not sure if those games are maxing out my vram but I'm guessing they max out what they can use and then start streaming textures leading to worse frame timing. I haven't measured this but something I noticed while watching Vram usage in New World and how it seemed to affect frame timing.


----------



## mouacyk

1080 Ti showing itself out now. Bye.


----------



## Imprezzion

Do you guys think they are going to do the whole non-A and A chip debacle again? What are your takes / expectations on that?

The way I see it is that I knew what I was getting into money wise with a 2080 Ti. Even if I sell that for like €400, it's still only a €300 upgrade for 80% performance if the numbers are actually true. That's actually not bad at all lol.


----------



## arrow0309

Imprezzion said:


> Do you guys think they are going to do the whole non-A and A chip debacle again? What are your takes / expectations on that?
> 
> The way I see it is that I knew what I was getting into money wise with a 2080 Ti. Even if I sell that for like €400, it's still only a €300 upgrade for 80% performance if the numbers are actually true. That's actually not bad at all lol.


You should try to obtain between 500 and 550 euro for your 2080ti. 
I'd start the selling listing right away, even with 600.

Anyway, I just sold my 2080 Super Strix for £480 and I'm in for a 3080 as well (might probably go for the FE).


----------



## Imprezzion

arrow0309 said:


> You should try to obtain between 500 and 550 euro for your 2080ti.
> I'd start the selling listing right away, even with 600.
> 
> Anyway, I just sold my 2080 Super Strix for £480 and I'm in for a 3080 as well (might probably go for the FE).


I have no proper spare GPUs at the moment. Only a whole stack of R9 280Xs and 7970s and a GTX 760 lol.

I won't go for the FE just because of the weird PCB not really allowing for cooling mods like an AIO + Kraken G12 or whatever aftermarket cooling that isn't a custom waterblock. I don't custom-watercool my cards as I switch cards quite often and can't be bothered to spend a ton on waterblocks all the time.

I really love the look of the ASUS ROG card's cooler, so I will probably go for a ROG one this time if the cooler performs well and it has a proper VRM and VRM cooling.


----------



## b.walker36

sakete said:


> Ha, I'm coming from 980ti. +300% performance?


We are like the same person lol


----------



## odin24seven

What I would like to know is how this is going to run triple-monitor 1440p 144Hz 1ms. I have a 1080 Ti now and do OK at about 60+ fps in BF5. Or should I look at the 3090?


----------



## Mad Pistol

I've got an LG CX OLED TV that is BEGGING for one of these to power it.

I can't wait!


----------



## Newbie2009

I'm in.

I'm also looking forward to seeing people crying about the psu wires melting.


----------



## Mooncheese

https://www.pcgamer.com/watch-nvidias-rtx-3080-rip-and-tear-the-2080-ti-at-4k-in-doom-eternal/


----------



## Anth0789

Yeah, I'm debating whether to upgrade to this or the 3070.


----------



## GeneralCuster44

Question: is the Founders Edition worth it, or do you wait for an FTW3 or one of the other models? It seems like the clocks are a bit better on the other models, so just curious what people think.


----------



## Mad Pistol

GeneralCuster44 said:


> Question: is the Founders Edition worth it, or do you wait for an FTW3 or one of the other models? It seems like the clocks are a bit better on the other models, so just curious what people think.


I'm getting whichever I can snag first. With as much power as the 3080 and 3090 require, that's where the limit is going to be. I'd be surprised if 3rd-party designs can overcome this.


----------



## sakete

Any news on which boards will have a reference PCB? Want to be able to slap a waterblock on a 3080.


----------



## liang333

Let's go, 3090.


----------



## sakete

liang333 said:


> Let's go, 3090.


This is the 3080 thread.


----------



## liang333

sakete said:


> This is the 3080 thread.


I meant it as a joke, but it didn't come out well. xD Sorry.
I think I'm getting the 3080; the 3090 is just too much. Feels like it's pointless unless you also get an 8K TV.


----------



## sakete

liang333 said:


> I meant it as a joke, but it didn't come out well. xD Sorry.
> I think I'm getting the 3080; the 3090 is just too much. Feels like it's pointless unless you also get an 8K TV.


I guess it depends on your needs, and I still have a 1440p/144hz monitor, so 3080 will be plenty and a huge upgrade for me coming from a 980ti.

I just hope more reliable info for waterblocks comes out in the next few days as that will influence which card I buy. Would strongly prefer EVGA either way as their warranty isn't voided when installing a custom block.


----------



## nick name

You guys see the leaks over at videocardz yet?









Alleged GeForce RTX 3080 graphics card test leaks online - VideoCardz.com


Chinese Bilibili channel TecLab has leaked alleged benchmark results of the GeForce RTX 3080. NVIDIA GeForce RTX 3080 review leaks A video showing the synthetic and gaming performance of the GeForce RTX 3080 graphics card has emerged online. The leakers claim to own a sample and a working...




videocardz.com





The leaked 3080 3D Mark figures compared to a 2080 Ti are reportedly all better (if the leaks are accurate).


----------



## profundido

Killmassacre said:


> I was hesitant to get a 3080 since it only has 10GB of VRAM, but I used my time machine to preorder one just in case (it sold out in 2 minutes!).


Aaah that's why we need the new 12pin Power adapter: To get the required 1.21 Gigawatts ;-)


----------



## gtz

Just pre-ordered mine through B&H Photo. Can't wait to start messing with the card.

Now just waiting for Zen 3


----------



## GoldCartGamer

The specifications for the MSI 3080 Gaming X Trio are on the site now. It has a boost clock of 1815 MHz.






GeForce RTX™ 3080 GAMING X TRIO 10G


MSI GeForce RTX™ 3080 GAMING X TRIO 10G features the TRI FROZR 2 thermal design, which brings the most advanced technology for ultimate cooling performance. It features the new TORX FAN 4.0, core pipe and airflow control combined with groundbreaking aerod




www.msi.com


----------



## alexp247365

My napkin math says a 3090 at stock will be 30 percent faster than an overclocked (~2000mhz) 2080ti. My 2080ti oc is about 10-15 percent above stock. I would hope that we get at least that much headroom on the 3090. So, the optimistic side of me thinks the 3090 over-clocked on water could do anywhere from 45-60 percent better than a 2080ti on water. Thoughts?


----------



## rexbinary

I am moving from a Strix 1080ti to a Strix 3080 unless I decide to wait for a 3080ti...hmm...


----------



## Chargeit

alexp247365 said:


> My napkin math says a 3090 at stock will be 30 percent faster than an overclocked (~2000mhz) 2080ti. My 2080ti oc is about 10-15 percent above stock. I would hope that we get at least that much headroom on the 3090. So, the optimistic side of me thinks the 3090 over-clocked on water could do anywhere from 45-60 percent better than a 2080ti on water. Thoughts?


[Official] NVIDIA RTX 3090 Owner's Club


----------



## rares495

nick name said:


> You guys see the leaks over at videocardz yet?
> 
> Alleged GeForce RTX 3080 graphics card test leaks online - VideoCardz.com
> 
> 
> Chinese Bilibili channel TecLab has leaked alleged benchmark results of the GeForce RTX 3080. NVIDIA GeForce RTX 3080 review leaks A video showing the synthetic and gaming performance of the GeForce RTX 3080 graphics card has emerged online. The leakers claim to own a sample and a working...
> 
> 
> 
> 
> videocardz.com
> 
> 
> 
> 
> 
> The leaked 3080 3D Mark figures compared to a 2080 Ti are reportedly all better (if the leaks are accurate).


*cries in 2080 Super*

Might snag a 3080 if the price is right, but it probably won't be. So far it's looking like the 3080 will cost ~860 EUR which to me seems like a lot for an x80 card. The 2080 Super was hovering around 720 EUR for the ****ty cheap coolers and even that is waaaaay too much.


----------



## PraiseKek

Big Boy Card


----------



## shilka

The NDA for the RTX 3080 is now the 16th of September, which is my birthday lol
Anyway, I am looking to get either an RTX 3070 or an RTX 3080, and since my trip to Spain was dropped this year I can spend the extra for the 3080, so might as well

Looking at all the cards, there are not really any I like other than the Asus TUF and the Zotac Trinity, and the last Asus TUF cards were a total trainwreck, so I am not too keen on those unless Asus has done a better job this time around

That leaves the Zotac Trinity as pretty much the only option I like, but I have never owned anything from Zotac before, so do any Zotac GPU owners have anything to say about them?

I plan on putting my MSI RTX 2070 Super Gaming X Trio back in my second PC, since the old crappy GTX 1060 3GB is nowhere near powerful enough. I bought an LG GL850 and moved my Asus PG279Q over to the second PC, so the RTX 2070 Super will be put to good use there

Thanks


----------



## shallow_

I really want an ASUS Strix card, but as they seem to be the most popular, I'm afraid those will sell out first. If I try to get one, I might also miss out on some of the others.

My no. 2 is the MSI Gaming Trio; love that it comes with a sag-support bracket.

I have allied myself with a friend who will also be ready on the dot on Thursday to try to buy; we'll see what we get.

If I end up with more than 1 I will either return it to the store or sell locally at cost. No scalping.


----------



## Quantium40

Welp peeps, it's finally time to upgrade from the old GTX 1070.

Whether it's an FE 3080, a third-party 3080, or an equivalent AMD card will depend on the bench numbers on the 16th, availability on the 17th, and, failing that, my ability to get one and prices not being too far out of MSRP range after that.

If I fail to get a card shortly, I'll wait for real availability without sky-high prices, and very possibly wait to see what AMD has with Navi 21/22.

So... it could be soon, or even as late as next year, before I actually get a 3080 or equivalent lol


----------



## whipple16

I was able to place an order for an ASUS TUF 3080 on B&H a few days ago. They took the money out of my account, so hopefully it holds and I'm in the first group of shipping!!


----------



## Chargeit

I'm going to go FE or bust on the 3080 10GB. That cooler just looks beefy, like it means serious business. I also like how the rear fan blows heat out of the case, which should help with how power hungry these will end up being. Should be interesting. Took PTO on the 17th, so I'm going to be sitting at my computer waiting for them to come in stock. If I don't get one, then they sold out in seconds. 🐂


----------



## gtz

whipple16 said:


> I was able to place an order for an ASUS TUF 3080 on B&H a few days ago. They took the money out of my account, so hopefully it holds and I'm in the first group of shipping!!


There was actually a Q&A thread on the B&H product page that was removed, probably because they don't technically allow pre-orders. On the Q&A it was stated that if you were able to place an order, the item would ship on either the 16th or the 17th. Amazon also had pre-orders available for a short time on the eleventh, but I was too late for those; they had more options. That is where all the other RTX 3080 brands on eBay came from, including the FE model. My friend managed to order an FE and it seems Amazon will honor the sale.

I ordered that TUF as well in the event I can't order an FE. Rumor has it the FE is an expensive card to make and Nvidia just launched it to have a $699 price tag, will only produce a handful, and will make their money from the partner cards that are more expensive.

Moore's Law is Dead posted a video about this as well, and now I am starting to believe that Nvidia did just what I said above.


----------



## cstkl1

GoldCartGamer said:


> The specifications for the MSI 3080 Gaming X Trio are on the site now. It has a boost clock of 1815 MHz.
> 
> 
> 
> 
> 
> 
> GeForce RTX™ 3080 GAMING X TRIO 10G
> 
> 
> MSI GeForce RTX™ 3080 GAMING X TRIO 10G features the TRI FROZR 2 thermal design, which brings the most advanced technology for ultimate cooling performance. It features the new TORX FAN 4.0, core pipe and airflow control combined with groundbreaking aerod
> 
> 
> 
> 
> www.msi.com


It's a ref card with 3x 8-pin.


----------



## shallow_

gtz said:


> There was actually a Q&A thread on the B&H product page that was removed, probably because they don't technically allow pre-orders. On the Q&A it was stated that if you were able to place an order, the item would ship on either the 16th or the 17th. Amazon also had pre-orders available for a short time on the eleventh, but I was too late for those; they had more options. That is where all the other RTX 3080 brands on eBay came from, including the FE model. My friend managed to order an FE and it seems Amazon will honor the sale.
> 
> I ordered that TUF as well in the event I can't order an FE. Rumor has it the FE is an expensive card to make and Nvidia just launched it to have a $699 price tag, will only produce a handful, and will make their money from the partner cards that are more expensive.





whipple16 said:


> I was able to place an order for an ASUS TUF 3080 on B&H a few days ago. They took the money out of my account, so hopefully it holds and I'm in the first group of shipping!!


It now looks like Amazon and B and H are having to cancel the preorders


https://www.reddit.com/r/nvidia/comments/iqv1dl

I do think this is only fair as there is a worldwide no-preorder rule..


----------



## gtz

shallow_ said:


> It now looks like Amazon and B and H are having to cancel the preorders
> 
> 
> https://www.reddit.com/r/nvidia/comments/iqv1dl
> 
> I do think this is only fair as there is a worldwide no-preorder rule..


BandH just released the funds back to my credit card. So this might be true. However my order has not been cancelled yet so here's to hoping.


----------



## sakete

gtz said:


> BandH just released the funds back to my credit card. So this might be true. However my order has not been cancelled yet so here's to hoping.


They typically only place a hold and don't charge until it ships. At least, that's how they do it with cameras.


----------



## gtz

sakete said:


> They typically only place a hold and don't charge until it ships. At least, that's how they do it with cameras.


Correct, however a hold would normally still show on the activity page. In this case the hold and everything associated with it is gone; the card shows nothing.


----------



## Mooncheese

I think I'm going to wait until November to see what NV have up their sleeve in response to Big Navi, by then we will all have a better idea as to what is the best 3090 variant in regards to water-cooling (and cooling the top bank of VRAM) and overclocking in regards to power delivery. 

Hell, if I wait long enough, who knows, maybe NV will do a repeat of Maxwell, where they released the 80 Ti card only about 3 months after the Titan (Titan X: Mar 2015, 980 Ti: Jun 2015)

I posted this over at EVGA forum: 




AHowes said:


> I'm sure one would have no problem selling said card for retail or more if they want to jump to another model later on.


_Even after NV release 20GB 3080 and 16GB 3070 variants within a quarter in response to Big Navi (who will have a 16GB card that sits between the 3070 and 3080 in rasterization @ $550, and a 24GB card that sits between the 3080 and 3090 @ $1k)? Remember the 1060? They sold 3GB and 6GB variants simultaneously, and the 6GB variants had considerably higher resale value.






See the gaping chasm in price and performance between the 3080 and the 3090? Yeah, NGreedia have something up their sleeve for Big Navi, whether that will be a 20GB 3080 or 3080S is the question. 

Everyone buying first-wave Ampere expecting it to stay at the top of the stack and hold value goes against what Nvidia has done with its product releases historically.

First-wave cards NEVER hold their value after the refresh.

See: 

RTX 2070

GTX 1080

GTX 1060

GTX 980 _


----------



## nick name

whipple16 said:


> I was able to place an order for an ASUS TUF 3080 on B&H a few days ago. They took the money out of my account, so hopefully it holds and I'm in the first group of shipping!!


Really? I thought Nvidia told sellers they couldn't pre-sell or take reservations. When sites reported that they also included messages from sellers that were returning "deposits" to customers.


----------



## whipple16

nick name said:


> Really? I thought Nvidia told sellers they couldn't pre-sell or take reservations. When sites reported that they also included messages from sellers that were returning "deposits" to customers.


They pulled the money out on the 10th when I ordered, and as of Sunday night I have nothing from them about refunding me or anything. Unless I get shipping confirmation by the time the cards are officially released, I'm still gonna try and get one. Worst case I end up with 2 and shouldn't have a problem getting rid of the one I don't want.


----------



## CalinTM

I have a 1080 Ti Strix and I'm moving to a 3080 Strix. My only concern is the 10GB of VRAM. But I game at 1080p with a high refresh rate and G-Sync, so I think for the next 3-4 years those 10GB will be enough for 90% of games at 1080p. But, and there's a but, I remember that in some later games my 1080 Ti Strix was loading about 8-9GB of VRAM, even almost 10GB, at 1080p. So I'm a little concerned about the 10GB the 3080 has. Still, I'm not waiting for a possible 3080 Ti with 20% more horsepower or 20GB of VRAM; I'm taking this one and that's it.
As for the price, I've seen that in Germany some of the 3080 models already have a price, about 850 euros I think. That's not much; just yesterday I reviewed my 2017 order for the 1080 Ti from my local store here in Romania, and I paid almost the same price for that card. So I'm hoping it will be the same for the 3080 Strix, maybe a little more expensive, tops +50-60 euros, all with tax included and the store's internal markup. So it's OK, I say... We will see.


----------



## gtz

whipple16 said:


> They pulled the money out on the 10th when I ordered, and as of Sunday night I have nothing from them about refunding me or anything. Unless I get shipping confirmation by the time the cards are officially released, I'm still gonna try and get one. Worst case I end up with 2 and shouldn't have a problem getting rid of the one I don't want.


B&H Photo has not cancelled my order; I placed it around 5am Friday morning. But all holds disappeared from my credit card yesterday. Usually when I pre-order or backorder, which was the case with my X570-I Strix board, the card had a hold until delivery. Currently the hold is gone altogether with no history, which has me a little worried. Time will tell.


----------



## rluker5

Does anyone know the time and time zone the FE will go up for sale?
I never went reference before, but they keep getting better.


----------



## gtz

rluker5 said:


> Does anyone know the time and time zone the FE will go up for sale?
> I never went reference before, but they keep getting better.


Just keep refreshing on the 17th; never trust a set time. Best Buy and Nvidia will have Founders cards, and chances are so will PNY's website (they always have in the past). Something to remember: people always go to stores like Newegg, Amazon, etc. and forget you can buy directly from PNY, Zotac, and even Dell.


----------



## shallow_

rluker5 said:


> Does anyone know the time and time zone the FE will go up for sale?
> I never went reference before, but they keep getting better.


The time is 6 a.m. Pacific, I believe. That is 15:00 CET (Norway/Europe).

It is the same time for the next launch, _the GeForce RTX 3070 will be available on October 15th at 6 a.m. Pacific Time._


----------



## sakete

shallow_ said:


> The time is 6 a.m. Pacific, I believe. That is 15:00 CET (Norway/Europe).
> 
> It is the same time for the next launch, _the GeForce RTX 3070 will be available on October 15th at 6 a.m. Pacific Time._


Is that just the time for the FE, or also for the AIB cards?


----------



## rluker5

shallow_ said:


> The time is 6 a.m. Pacific, I believe. That is 15:00 CET (Norway/Europe).
> 
> It is the same time for the next launch, _the GeForce RTX 3070 will be available on October 15th at 6 a.m. Pacific Time._


Thank you very much. Looks like I will be quietly disappearing at work at 8:00


----------



## shallow_

rluker5 said:


> Does anyone know the time and time zone the FE will go up for sale?
> I never went reference before, but they keep getting better.





rluker5 said:


> Thank you very much. Looks like I will be quietly disappearing at work at 8:00


I have already taken a half day off

This release will surely test some e-tailers' systems...


----------



## shallow_

sakete said:


> Is that just the time for the FE, or also for the AIB cards?


From what I understand the sale time is the same for all; I think only the review embargo is different.

Anyone feel free to correct me if I'm wrong.


----------



## DarthBaggins

If I could pre-order I would, but I'll be waiting till Dec or Jan to get one - hoping it truly is a fitting replacement for my 1080 Ti (it has been one of the best cards I've been fortunate to have).


----------



## Giustaf

I pre-ordered a 3080 Strix (non-OC) on the French Amazon site... 817€. I hope they don't cancel my order


----------



## zhrooms

Original post updated, now lists available and upcoming cards for the western market.

Personally leaning towards *not* getting a 3090, because it's just *20%* faster for *more than twice* the price. NVIDIA has never released a gaming card with a worse price/performance ratio than this.


----------



## sakete

FTW3 cards not available on Thursday:

 https://twitter.com/i/web/status/1305894136159940608


----------



## shiokarai

zhrooms said:


> Original post updated, now lists available and upcoming cards for the western market.
> 
> Personally leaning towards *not* getting a 3090, because it's just *20%* faster for *more than twice* the price. NVIDIA has never released a gaming card with a worse price/performance ratio than this.


My sentiment exactly. The RTX 3090 price is just too much, even after RTX 2080 Ti prices... I'm almost certain there will be RTX 3080 20GB, maybe Q1 2021 after micron releases higher density mem chips.


----------



## Awsan

Any news on which cards are "Bin 2", and whether we are getting a higher power limit than 320W?


----------



## t1337dude

zhrooms said:


> Original post updated, now lists available and upcoming cards for the western market.
> 
> Personally leaning towards *not* getting a 3090, because it's just *20%* faster for *more than twice* the price. NVIDIA has never released a gaming card with a worse price/performance ratio than this.


True. On the other hand, it is the fastest card and $1500 isn't a crazy amount of money, especially in terms of "high-end things".


shiokarai said:


> My sentiment exactly. The RTX 3090 price is just too much, even after RTX 2080 Ti prices... I'm almost certain there will be RTX 3080 20GB, maybe Q1 2021 after micron releases higher density mem chips.


It's clearly not a value-oriented card, and rarely do the top-end cards offer great value relative to the lower line models. Having said that, I see people regularly waste greater amounts of money on various things with less pay-off. If you're going to blow an extra $800, blowing it on these videocards isn't an awful idea for someone who considers themselves to be an enthusiast. The government was just handing out unemployment to a large amount of Americans, which paid significantly more than their job due to the extra $600/wk stipend. Just one week of that stipend covers most of the difference between the 3080 and 3090.


----------



## pewpewlazer

t1337dude said:


> True. On the other hand, *it is the fastest card and $1500 isn't a crazy amount of money*, especially in terms of "high-end things".


True, but when you consider that it costs $300 more than last generation's "fastest card", which cost $500 more than the "fastest card" the generation before that... It's rather expensive.

And 114% more expensive than a 3080 for (hopefully) ~20% more performance? That's REALLY bad no matter how you spin it.

If you're someone who has been dropping a grand or more on the _Titan_ cards since 2013, then $1500 ain't no thang for you. Hell, compared to the Titan RTX at $2500, the 3090 is an absolute steal. But if the 3090 is indeed the spiritual successor to the "Titan" cards, then it's extremely weird that they dropped the _Titan_ moniker while simultaneously dropping the price. They're not exactly marketing it as a "prosumer" or professional/non-gaming card either.
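The 114% figure above is easy to sanity-check. A minimal Python sketch, using the $699 and $1,500 MSRPs quoted in this thread and treating the ~20% performance uplift as an assumption (it's an estimate, not a measured result):

```python
# Price/performance comparison of the 3080 ($699) vs the 3090 ($1500).
# The 1.20x relative performance is this thread's estimate, not a benchmark.
price_3080, price_3090 = 699, 1500
uplift = 1.20  # assumed 3090 performance relative to the 3080

premium = price_3090 / price_3080 - 1
print(f"3090 price premium over 3080: {premium:.1%}")  # ~114.6%

# Performance per dollar, with the 3080 normalized to 1.0
perf_per_dollar_3080 = 1.0 / price_3080
perf_per_dollar_3090 = uplift / price_3090
ratio = perf_per_dollar_3090 / perf_per_dollar_3080
print(f"3090 perf-per-dollar vs 3080: {ratio:.2f}x")  # ~0.56x
```

In other words, under that 20% assumption the 3090 delivers a bit over half the performance per dollar of the 3080, which is the point being argued here.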



t1337dude said:


> The government was just handing out unemployment to a large amount of Americans, which paid significantly more than their job due to the extra $600/wk stipend. Just one week of that stipend covers most of the difference between the 3080 and 3090.


So you're suggesting that folks who are now unfortunately unemployed due to "_COVID-19_" should buy a 3090 because the government is giving them money? Fantastic financial advice!


----------



## t1337dude

pewpewlazer said:


> True, but when you consider that it costs $300 more than last generation's "fastest card", which cost $500 more than the "fastest card" the generation before that... It's rather expensive.
> 
> And 114% more expensive than a 3080 for (hopefully) ~20% more performance? That's REALLY bad no matter how you spin it.
> 
> If you're someone who has been dropping a grand or more on the _Titan_ cards since 2013, then $1500 ain't no thang for you. Hell, compared to the Titan RTX at $2500, the 3090 is an absolute steal. But if the 3090 is indeed the spiritual successor to the "Titan" cards, then it's extremely weird that they dropped the _Titan_ moniker while simultaneously dropping the price. They're not exactly marketing it as a "prosumer" or professional/non-gaming card either.
> 
> 
> 
> So you're suggesting that folks who are now unfortunately unemployed due to "_COVID-19_" should buy a 3090 because the government is giving them money? Fantastic financial advice!


Missing the point here. The point isn't that people should buy the 3090 with extra free money from the government... my point is: why wouldn't they? I know a few people who bought new cars with their unemployment money. It's not that the money wouldn't be better saved, but you should know by now the story of Americans and money - easy come, easy go.

Yes, the 3090 is a bad value if you're thinking purely in terms of price/performance. But people who buy the 3090 aren't under the impression they're getting the best deal, they are buying it because they want the best card, and that's why I'll be buying one.


----------



## MacMus

Is it worth replacing a 2080 Ti with a 3080?


----------



## AuraNova

MacMus said:


> Is it worth replacing a 2080 Ti with a 3080?


Not really. I mean, yeah, there's an assumed performance gain there. I just feel that no matter what point you bought a 2080Ti, it's still a good card. I would say just enjoy what you can out of the 2080Ti and wait until a 40 series card comes out. Unless you have money to throw away and want the latest and greatest, then by all means get the 3080.


----------



## BrokenSpring_12

I've been slowly getting back into gaming on my PC, and my 1060 6GB is wanting to be replaced. I feel the pull of the Zotac 3080.
I only have a 1080p 60Hz monitor for now, so I'm not planning to move from 1080p yet.
Do you think I will be limited much with a B350 board and a Ryzen 1600?

I'm pretty excited to see real benchmarks; when's the embargo up?


----------



## gtz

So overall a 20-30% performance uplift in 4K vs the 2080 Ti. Impressive, but honestly if you are a 2080 Ti owner you still have a very capable graphics card. I was going to cancel my Asus TUF pre-order, because a) I don't really need it and b) it is ugly in my opinion. I offered it to one of my friends who missed the Amazon and B&H pre-orders last week for what I paid, and he took it. For now the 2080 Ti will still serve me well.

I will wait and see what AMD brings, or if a 3080 Super or Ti comes out.


----------



## sakete

gtz said:


> So overall a 20-30% performance uplift in 4K vs the 2080 Ti. Impressive, but honestly if you are a 2080 Ti owner you still have a very capable graphics card. I was going to cancel my Asus TUF pre-order, because a) I don't really need it and b) it is ugly in my opinion. I offered it to one of my friends who missed the Amazon and B&H pre-orders last week for what I paid, and he took it. For now the 2080 Ti will still serve me well.
> 
> I will wait and see what AMD brings, or if a 3080 Super or Ti comes out.


Yeah, makes total sense if you have a 2080ti. For me with a 980ti, this will be a yuuuuuuge improvement for me


----------



## gtz

sakete said:


> Yeah, makes total sense if you have a 2080ti. For me with a 980ti, this will be a yuuuuuuge improvement for me


You will get a huge increase. I really miss my old 980 Ti; honestly it was my favorite card.


----------



## zhrooms

Let me remind everyone:

2080 Ti Reference PCB - $99 Asetek 280mm AIO + $25 NZXT G12 Adapter + 2x 120mm PWM Fans $25 = ~$150
Galax XOC BIOS and Curve Editor, 2175MHz core and +1200 memory

Time Spy DX12 1440p
2080 Ti FE 1635MHz Stock Boost = 13,600 Graphics Score
2080 Ti 2175MHz Manual OC = 17,635 Graphics Score (+29.7%)
3080 FE 1710MHz Stock Boost = 17,800 Graphics Score (+0.9%)

The 3080 is roughly 0-10% faster depending on OC, game, and resolution; so far I've seen 4K+RT results up to 15%, and that's where it will shine.

For $699 it's good but not what we hoped for, and the 3090 is just 20% faster, meaning this 0-10% becomes 20-30%, not even close to what we got going from Pascal to Turing, which was 50%. The GeForce 30 series is really disappointing, but I'm very likely swapping out my Ti for a 3080 anyway; 15% extra 4K+RT performance is something at least, plus HDMI 2.1.
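The percentage deltas in that score table follow directly from the Graphics Scores; a minimal Python sketch, using only the numbers quoted in this post:

```python
# Time Spy DX12 1440p Graphics Scores quoted above
score_2080ti_stock = 13600  # 2080 Ti FE, 1635 MHz stock boost
score_2080ti_oc    = 17635  # 2080 Ti, 2175 MHz manual OC
score_3080_stock   = 17800  # 3080 FE, 1710 MHz stock boost

def uplift_pct(new, old):
    """Relative improvement of `new` over `old`, in percent."""
    return (new / old - 1) * 100

print(f"2080 Ti OC vs stock:      +{uplift_pct(score_2080ti_oc, score_2080ti_stock):.1f}%")  # +29.7%
print(f"3080 stock vs 2080 Ti OC: +{uplift_pct(score_3080_stock, score_2080ti_oc):.1f}%")    # +0.9%
```

This is the whole argument in two lines: the manual OC closes almost the entire gap, leaving the stock 3080 less than 1% ahead of a well-cooled 2080 Ti in this particular benchmark.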


----------



## padman

Comparing a crazily OC'd 2080 Ti to a stock 3080 doesn't look too good, but as soon as we get high-power-limit BIOSes for the 3080, I'm quite confident the difference will be back in the 20-25% range.


----------



## shiokarai

Without unlocked/higher-limit BIOSes, the 30 series really won't look that exciting compared to a heavily OC'd RTX 2080 Ti, and the question is whether there will be any unlocked BIOSes available at all. At 1440p it's underwhelming, to say the least.


----------



## zhrooms

padman said:


> Comparing a crazily OC'd 2080 Ti to a stock 3080 doesn't look too good, but as soon as we get high-power-limit BIOSes for the 3080, I'm quite confident the difference will be back in the 20-25% range.


That is not "crazily OC'd"; that's regular overclocking with any AIO or water block. Whether it throttles or not is up to the BIOS. If you want to go crazy, you use 1.125V and 2205MHz.


----------



## zhrooms

shiokarai said:


> Without unlocked/higher-limit BIOSes, the 30 series really won't look that exciting compared to a heavily OC'd RTX 2080 Ti, and the question is whether there will be any unlocked BIOSes available at all. At 1440p it's underwhelming, to say the least.


Yes, this is true; we have no idea if the 30 series will get BIOS flashing, and most LN2 cards and BIOSes will be available for 3090 cards, not the 3080.

All 2x 8-pin cards top out at 375W. The regular NVIDIA reference PCB with 2x 8-pin runs 320W stock and maxes out at 109% (350W), while the NVIDIA custom-PCB FE with its 1x 12-pin runs 320W stock and 370W max.

An RTX 2080 Ti with the XOC BIOS and great cooling (the biggest triple-fan air-cooled cards, an AIO via NZXT G12, or a water block) at its highest overclock will get extremely close to an overclocked 375W-max 2x 8-pin 3080. We're talking about a few percent, which won't be noticeable in games at all; literally something like 130 vs 134 FPS. And the 3090 is not much better: it's 20% faster, which should bring it up to maybe 20-25% at 370W, 130 to 160 FPS (22.5%), but it costs $800 more than a 3080. I would have loved the 3090 at $1199, but $300 above that for the cheapest 2x 8-pin models is a big no-no. Buying a 3090 without 3x 8-pin is such a mistake; you pay that much for a card that is very limited when it comes to overclocking. At 350W stock and a 370W max power limit, 20 extra watts will barely get you any overclocking, so only 3x 8-pin cards with higher power limits of 450W+ will produce worthwhile overclocking.

I will still get a 3080 because of the slight improvement in RTX games (up to 15% over an overclocked 2080 Ti), but I'm in no rush at all to do it. It might take months before we have any actual stock at retailers, and by that point Black Friday is coming; I could maybe get a 3080 custom-PCB 3x 8-pin for the 2x 8-pin price.

This also means that 2080 Tis still hold their value very well. New 3080s will be very hard to find and will cost at least €800 around Europe, meaning if you want similar performance without waiting, you can just get a used 2080 Ti for €699 (with a few months of store warranty left). I'd say the minimum value of the worst Ti is €599 and the best value €749 for the most premium models, with the rest somewhere in between. I'm definitely not selling my Ti for less than €649; it performs near-identically (17,635 Time Spy Graphics Score compared to the 3080 at 17,800).
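The connector ceilings in the post above can be cross-checked against the standard PCIe power budget. A small sketch, assuming the usual spec values (75W from the slot, 150W per 8-pin); note the FE's single 12-pin is a different connector, so its ~370W figure is a BIOS limit rather than a connector ceiling:

```python
# Theoretical board power budget by auxiliary connector configuration,
# using the common PCIe budget: 75W from the slot, 150W per 8-pin plug.
SLOT_W = 75
EIGHT_PIN_W = 150

def board_budget_watts(num_8pin):
    """Max in-spec board power for a card with `num_8pin` 8-pin connectors."""
    return SLOT_W + num_8pin * EIGHT_PIN_W

for n in (2, 3):
    print(f"{n}x 8-pin card: {board_budget_watts(n)}W max")  # 375W, 525W
```

This is why the post treats 375W as the hard ceiling for 2x 8-pin 3080s and why only 3x 8-pin designs have room for 450W+ power limits.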


----------



## shallow_

So I called the biggest computer-parts seller in Norway today, asking if they could give out info about how many of each card they have received in stock for tomorrow's release.

They would not; they said they did not have this info, and that there is going to be an info meeting for the staff tomorrow morning.

I then asked them if they had ever seen such an anticipated launch, and they said it was like this every release...

Is that true ?

Was the 2080/ti release this tense/sought after ??

For me it seems tomorrow is going to be black thursday x10


----------



## zhrooms

shallow_ said:


> Was the 2080/ti release this tense/sought after ??


Yes, the 2080 release was a disaster; it took up to 2 months for custom PCBs to arrive. I ordered my card about an hour after they went live, got it after 30 days, and I was one of the first. It was insane, and everything is pointing to the same thing happening now; most people won't be able to get cards for several weeks minimum. Anyone who panic sold their Ti's a week or two ago might have to be without a gaming GPU for up to 90 more days. And then they learn that their 2080 Ti overclocked was almost as fast.


----------



## shallow_

zhrooms said:


> Anyone who panic sold their Ti's a week or two ago might have to be without a gaming GPU for up to 90 more days. And then they learn that their 2080 Ti overclocked was almost as fast.


Yeah, all the latest used 2080 Ti ads now refer to how little faster the 3080 is...

I'm in no hurry to get the new card; I'm only in a hurry to get it at release retail price.


----------



## Addsome

zhrooms said:


> Let me remind everyone:
> 
> 2080 Ti Reference PCB - $99 Asetek 280mm AIO + $25 NZXT G12 Adapter + 2x 120mm PWM Fans $25 = ~$150
> Galax XOC BIOS and Curve Editor, 2175MHz core and +1200 memory
> 
> Time Spy DX12 1440p
> 2080 Ti FE 1635MHz Stock Boost = 13,600 Graphics Score
> 2080 Ti 2175MHz Manual OC = 17,635 Graphics Score (+29.7%)
> 3080 FE 1710MHz Stock Boost = 17,800 Graphics Score (+0.9%)
> 
> The 3080 is roughly 0-10% faster depending on OC, game, and resolution; so far I've seen 4K+RT results up to 15%, and that's where it will shine.
> 
> For $699 it's good but not what we hoped for, and the 3090 is just 20% faster, meaning this 0-10% becomes 20-30%, not even close to what we got going from Pascal to Turing, which was 50%. The GeForce 30 series is really disappointing, but I'm very likely swapping out my Ti for a 3080 anyway; 15% extra 4K+RT performance is something at least, plus HDMI 2.1.


Which Asetek 280mm AIOs are $99? I want to mount the G12 and a 240mm-280mm AIO on my Zotac AMP 2080 Ti. There's a sale on the Corsair H110 for $100 CAD, which is around $75 USD, but I'm afraid of buying such an old-stock AIO and having leaks or degraded cooling performance because it's been sitting on a shelf for 5+ years.


----------



## maltamonk

BrokenSpring_12 said:


> I've been slowly getting back into gaming on my PC, and my 1060 6GB is wanting to be replaced. I feel the pull of the Zotac 3080.
> I only have a 1080p 60Hz monitor for now, so I'm not planning to move from 1080p yet.
> Do you think I will be limited much with a B350 board and a Ryzen 1600?
> 
> I'm pretty excited to see real benchmarks; when's the embargo up?


The 3080 is not for you as you'd be bottlenecked in several places. Instead you could use the price difference to balance out your system.


----------



## chuuurles

Will my 5930K be bottlenecked by this @ 1440p? I may also move to triples in the future for my sim racing. Hard to find info about this old and obscure chip.


----------



## nick name

chuuurles said:


> Will my 5930K be bottlenecked by this @ 1440p? I may also move to triples in the future for my sim racing. Hard to find info about this old and obscure chip.


From what I've seen in reviews it will in some games.


----------



## Thoth420

I am using a [redacted] CPU. We will call it enicaR and cooling it via the water in Lake Michigan. Think it will bottleneck at 3440 x 1440?


----------



## chuuurles

That depends on what depth you are pulling the water from.


----------



## t1337dude

So...the MSI Gaming X is probably the AIB to grab at launch?


----------



## Alemancio

The MSI Gaming X? Based on what? We don't know anything.


----------



## Thoth420

I can say that I absolutely loved my Gaming X Trio 2080 Ti. Whisper quiet. She was big though but this gen I don't see avoiding that anyways.


----------



## Imprezzion

Is there anything known about power limits on 3rd-party cards like the MSI Gaming X Trio or ASUS ROG Strix OC? All the spec sheets on their own webpages just list 320W (MSI lists 340W).

They should go on sale in about 5 hours here and I wanna try and pick one up before they all sell out instantly, but I wanna know which one to get lol.

I'm kinda torn between the MSI GXT and ASUS ROG OC. They both look incredible, but from experience I know ASUS usually has full custom PCBs that don't really allow for BIOS flashes or power mods easily, while MSI usually has more options there. Plus, for the RTX 2xxx series at least, they had a way higher power limit than other brands, and as the above post said, the GXT coolers are incredible noise- and temp-wise on the RTX 2xxx series.


----------



## VPII

This is where I, as a South African, choose according to price, not assumed quality. We are privileged enough to get brands not available in the USA, such as Palit. Having used a Palit RTX 2080 Ti GamingPro OC for the past year and 8 months, I can confirm the card is great. Off the bat it came with a 300-watt power limit, not really enough when overclocking, but still great. I ran the card with an XOC 1000-watt BIOS, an NZXT Kraken G12, and a Corsair H110 cooler for the time stated, and it did really well.

My reason for mentioning Palit is that here in South Africa, when you buy Asus, MSI, EVGA, or Gigabyte you'll be paying 250 to 325 USD more for the card. So take your pick.


----------



## t1337dude

Alemancio said:


> The MSI Gaming X? Based on what? We don't know anything.


There's plenty we do know by this point - basically everything save for benchmarks. We've had the MSI Gaming X cards for a while over previous generations, and they're usually the most competitive in terms of air-cooling performance. Similarly, FE cards' inferior cooling performance is also to be expected based on previous trends. But please, don't buy the card at launch - wait for the benchmarks.


----------



## Imprezzion

You have got to be joking.. a local e-tailer just listed pricing for the AIB models... €1499 for a 3080... That has got to be a joke...


----------



## ZealotKi11er

So has anyone been able to pick one up?


----------



## sakete

ZealotKi11er said:


> So has anyone been able to pick one up?


All the websites are down, haha. EVGA.com has crashed, Newegg.com has crashed. And Amazon isn't allowing ordering yet.


----------



## BigMack70

Did anyone get one off Nvidia's website? Been refreshing for the past 15 minutes and have never seen anything but "notify me". This looks like the definition of a paper launch.


----------



## sakete

Pretty sure it's already sold out everywhere now. Best Buy was showing "Notify Me" for an EVGA card, I refreshed and then it showed "Sold Out".


----------



## Imprezzion

I put an order in with an e-tailer for an MSI Gaming X Trio; the payment went through and the order is being processed. We'll see later today if I get a tracking code for a shipment or an email saying sorry, we don't have it.

If the order passes I will have it tomorrow.


----------



## rluker5

I saw that on Nvidia, but this phone is aggravatingly inept, so there is that


----------



## shallow_

I have ordered an MSI Gaming X Trio from proshop.no, and it was listed in stock before AND after my payment cleared. ETA 4-5 days, I think.

I also placed an order for an ASUS ROG Strix OC from komplett.no. This card was not in stock, but the ETA to the seller is tomorrow.

I will only keep one card, but I had a 10% off + $35 off discount that was used on the ASUS card, so I'm technically paying the same for both.


----------



## Imprezzion

Everything is on 10+ working days now, so... I wonder if I was in time with my order lol. The website kept crashing during the payment screen, so it took me 7-8 minutes to actually order it..


----------



## Anth0789

Yeah, Newegg is down. I had one in the cart, and as soon as I went to pay it crashed. Almost seems impossible to get one.


----------



## sakete

Anth0789 said:


> Yeah, Newegg is down. I had one in the cart, and as soon as I went to pay it crashed. Almost seems impossible to get one.


Bummer. I was able to snag one on EVGA.com luckily. Took almost an hour though, their site was super slow and got multiple errors.


----------



## Shawnb99

EVGA site shows all as out of stock


----------



## Thoth420

Anth0789 said:


> Yeah, Newegg is down. I had one in the cart, and as soon as I went to pay it crashed. Almost seems impossible to get one.


Same here, buddy. I didn't even stop to look at the model, and it crashed anyway. Looks like a long camp at Microcenter next week.


----------



## BigMack70

Shawnb99 said:


> EVGA site shows all as out of stock


It looks officially gone from everywhere online in the US. Unless Nvidia decides to actually sell some from its website, but it looks like they never did.


----------



## Shawnb99

BigMack70 said:


> It looks officially gone from everywhere online in the US. Unless Nvidia decides to actually sell some from its website, but it looks like they never did.


Next week is not going to be fun.


----------



## sjd

MSI Gaming X Trio power limited at 350W. What was the point of having 3x8 pin?


----------



## acoustic

Did anyone get one? I had the XC3 Ultra in my cart on EVGA's page, but I couldn't get to checkout. I'm not really upset about it, since I wanted the 3090 anyway, and I'm not a fan of the AIB cards outside of the FTW3 Ultra or the Gigabyte AORUS Xtreme .. but holy hell, that was insane how destroyed the servers were.

I hope some real people got cards..


----------



## sakete

acoustic said:


> Did anyone get one? I had the XC3 Ultra in my cart on EVGA's page, but I couldn't get to checkout. I'm not really upset about it, since I wanted the 3090 anyway, and I'm not a fan of the AIB cards outside of the FTW3 Ultra or the Gigabyte AORUS Xtreme .. but holy hell, that was insane how destroyed the servers were.
> 
> I hope some real people got cards..


I got one, should arrive tomorrow. XC3 Ultra. Took me an hour to get through evga.com checkout process.


----------



## BigMack70

in stock on Nvidia


----------



## nick name

BigMack70 said:


> in stock on Nvidia


Not cool, man. Not cool.


----------



## BigMack70

nick name said:


> Not cool, man. Not cool.

If it makes you feel any better, it looks like Nvidia is going to do the same thing EVGA did to me - let me put it in the cart, then crash before I can check out, and then the whole thing will time out and I won't get a card.


----------



## shallow_

This is how ppl are feeling??


----------



## Avacado

shallow_ said:


> This is how ppl are feeling??


Hilarious


----------



## mattxx88

sjd said:


> MSI Gaming X Trio power limited at 350W. What was the point of having 3x8 pin?


+1


----------



## BigMack70

Yup. Didn't get it. It crashed during the checkout process, and by the time the website recovered, it had removed it from my cart.


----------



## Imprezzion

The e-tailer I just ordered mine with sent out a statement saying 800 orders went through before they shut down the ordering and they can ship out a very small amount of them.
The rest of the orders are not expected to be delivered within 2 months. There is no stock and no deliveries in sight.


----------



## chuuurles

I went to Canada Computers 10 mins before opening this morning. About 10-15 ppl in line; the owner came out and said they received ONE unit and it went to the dude who lined up last night at 12am :/


----------



## Imprezzion

Nvidia themselves told the e-tailer not to expect any new cards for at least 2 months, and AIBs also don't have any stock.

I'm really hoping I was on time and I can actually get a card lol. I'd feel like a special boi haha.


----------



## nick name

BigMack70 said:


> Yup. Didn't get it. It crashed during the checkout process, and by the time the website recovered, it had removed it from my cart.


Best Buy keeps screwing up at the shipping option. Can't set store pickup and it errors when shipping to me.


----------



## Shadowdane

sjd said:


> MSI Gaming X Trio power limited at 350W. What was the point of having 3x8 pin?


Yeah, I was kinda set on getting that card, but the max limit of 350W is pretty restrictive... in the review I saw, the power limit slider only goes up to 102%. What a joke!


----------



## mouacyk

Shadowdane said:


> Yeah, I was kinda set on getting that card, but the max limit of 350W is pretty restrictive... in the review I saw, the power limit slider only goes up to 102%. What a joke!


Had a great run with MSI's Ti series, but yeah, that measly 3% to max out at 350W doesn't make any sense, since the FE can go to 370W. In line with that blunder, they offer the worst VRM, and 3x 8-pins for nothing. Their Turing offerings weren't great either. Pretty sure MSI has thrown in the towel when it comes to creating enthusiast products.


----------



## zhrooms

MSI Gaming X Trio is the worst card released today yes, 16 Power Stages, 350W power limit.. on a 3x8-Pin PCB that allows for up to 525W. They literally gave up, not even trying.

Meanwhile ASUS TUF OC has 20 Power Stages, 375W power limit on 2x8-Pin (max 375W), about 25mm shorter than Gaming X Trio, has an extra HDMI 2.1 port, and costs about €100 less here in Scandinavia. *ASUS won today, big time.*


----------



## padman

Shadowdane said:


> Yeah, I was kinda set on getting that card, but the max limit of 350W is pretty restrictive... in the review I saw, the power limit slider only goes up to 102%. What a joke!


The default power limit for the Gaming X Trio is 340W and can only be increased to 350W. This is absolutely hilarious. Even the Founders Edition goes up to 370W.
Why do they build a custom PCB with three power connectors for up to 525W of power consumption, but then limit it to 340 watts (+ a ridiculous 10 watts of OC headroom)?
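For context on where the 525W figure comes from: the PCIe spec allows 150W per 8-pin connector plus up to 75W from the x16 slot itself. A minimal sketch of that arithmetic (the helper name is mine):

```python
# Back-of-envelope PCIe power budgets (spec values: 150 W per 8-pin
# connector, up to 75 W from the x16 slot itself).
def board_power_budget(n_8pin: int, slot_watts: int = 75) -> int:
    """Theoretical max board power for a given 8-pin connector count."""
    return n_8pin * 150 + slot_watts

print(board_power_budget(3))        # Gaming X Trio PCB: 525 W available
print(board_power_budget(2))        # 2x 8-pin PCB: 375 W in-spec ceiling
print(board_power_budget(3) - 350)  # 175 W the Trio's 350 W cap leaves unused
```

So a 2x 8-pin card like the TUF already sits at its in-spec ceiling of 375W, while the Trio's third connector buys it nothing under the shipped BIOS.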


----------



## Sir Beregond

sjd said:


> MSI Gaming X Trio power limited at 350W. What was the point of having 3x8 pin?


Lol. Good question. What a joke MSI.


----------



## shallow_

zhrooms said:


> MSI Gaming X Trio is the worst card released today yes, 16 Power Stages, 350W power limit.. on a 3x8-Pin PCB that allows for up to 525W. They literally gave up, not even trying.
> 
> Meanwhile ASUS TUF OC has 20 Power Stages, 375W power limit on 2x8-Pin (max 375W), about 25mm shorter than Gaming X Trio, has an extra HDMI 2.1 port, and costs about €100 less here in Scandinavia. *ASUS won today, big time.*


I'll admit, I have never OC'd a GPU, so these limits don't really mean that much to me. But I think I heard it mentioned in one review that with a 375W max on the card, only 2x 8-pin with a 375W max leaves no headroom..

But is this not something that can be changed in the future with a firmware update or something? Or is the limit hardware and unchangeable?

I got a 3x 8-pin card just to show off as many of the sleeved extension cables I purchased for potential SLI upgrades in the past


----------



## Sir Beregond

shallow_ said:


> But is this not something that can be changed in the future with a firmware update or something? Or is the limit hardware and unchangeable?


I'm not entirely sure, but I think getting around power limitations usually requires shunt modding?
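For anyone curious how a shunt mod changes the numbers: it places a resistor in parallel with the stock current-sense shunt, so the controller under-reads current and the power cap effectively scales up. A rough sketch of the arithmetic, with purely illustrative resistor values (actual shunt values vary per card - check yours before touching anything):

```python
# Shunt-mod arithmetic: a resistor in parallel with the stock sense shunt
# lowers the sensed voltage drop, so the controller under-reads power.
def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

stock_shunt = 0.005                    # 5 mOhm (illustrative value only)
modded = parallel(stock_shunt, 0.005)  # equal resistor on top -> 2.5 mOhm
scale = stock_shunt / modded           # controller under-reads by this factor
print(scale)                           # 2.0: a 350 W cap behaves like ~700 W
```

Obviously this voids warranties and removes the safety margin the limit was there to provide.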


----------



## Mad Pistol

I was looking forward to joining the club... but not today I suppose.

Scalpers can go die in a dumpster fire.


----------



## shallow_

I have now read several reviews of the MSI Gaming X Trio, and none of those reviews fault the power limit the way it has been presented here in the thread.

With 2 cards on the way (the MSI arriving first; the ASUS Strix OC I read is postponed until 21 Sept.), I would of course like to keep the best one, and am therefore in need of knowledge to support my choice.

And don't worry, as I mentioned before, the card I'm not keeping I will let go at cost.


----------



## mouacyk

shallow_ said:


> I have now read several reviews of the MSI Gaming X Trio, and none of those reviews fault the power limit the way it has been presented here in the thread.
> 
> With 2 cards on the way (the MSI arriving first; the ASUS Strix OC I read is postponed until 21 Sept.), I would of course like to keep the best one, and am therefore in need of knowledge to support my choice.
> 
> And don't worry, as I mentioned before, the card I'm not keeping I will let go at cost.


I was actually surprised it had the best OC performance at TPU, even though it had the worst power limit at 350W. This might be attributed to either unit quality or NVIDIA binning, and might not apply across the board to MSI's 3080 stack. They have a history of sending reviewers higher-TDP variants to show more performance than retail samples. It's possible that in this case they sent TPU a high-quality bin.


----------



## Imprezzion

I'd settle for any model right now.. if it's a cheap one with a crappy cooler I'll just strap my Kraken G12 + X62 to it lol.

Bit disappointed with MSI here, but it wouldn't be the first time a BIOS update from them raised the power limits after launch.

I am kinda thinking that if my order doesn't get shipped before October 1st, I will cancel it and go for an ASUS model.


----------



## shallow_

At 2:16:12 the MSI guys answer a user question from @DK about the power limit.


----------



## BigMack70

Imprezzion said:


> I'd settle for any model right now.


Me too. I just want anything that has an HDMI 2.1 port at this point. I'll get around to buying the exact specific model I want later once supply issues are resolved, likely a 3090 Hybrid or Kingpin if I can find one.


----------



## shiokarai

Was anyone (except bots/scripts) able to actually buy an RTX 3080 FE from the Nvidia store today? Was it just a razor-thin paper launch? It sure seems so!


----------



## Larkonian

I wouldn't be surprised if all the Nvidia chips of the current crop of cards are BIOS locked to 350w max. Possibly the FTW/Strix/Aorus etc. will have an *"A"* chip with higher power limits like with Turing.


----------



## gerardfraser

Just bought a Zotac RTX 3080 on Newegg Canada at 5:36PM, confirmed, and bought an MSI RTX 3080 Gaming Trio on Amazon Canada a few hours ago, but no ship date yet.


----------



## Stephen.

Ebay is just getting ridiculous now, look at this troll listing

Manufacturers Warranty 420 Years


----------



## sakete

Stephen. said:


> Ebay is just getting ridiculous now, look at this troll listing
> 
> Manufacturers Warranty 420 Years


Haha. Well, sometimes there are people with more money than sense who would actually fall for it.

Many listings of $2k+ though. I got my hands on an XC3 Ultra; maybe I should flip it for $1.5k and get a 3090... nah.


----------



## Stephen.

sakete said:


> Haha. Well, sometimes there are people with more money than sense who would actually fall for it.
> 
> Many listings of $2k+ though. I got my hands on an XC3 Ultra; maybe I should flip it for $1.5k and get a 3090... nah.


It would make sense to flip it for the better card if you didn't have the funds for it originally. There are a lot of "Buy It Now" listings from $1500-$2000; whether they're legit or not is unknown, and luckily eBay has buyer protections.

But if I'm spending 84 grand on a GPU, it better come with either a free M3 Competition or a diamond-encrusted Rolex Daytona.

As you can see with this listing, the bidders' identities are protected, which I've never seen on eBay before.

RTX 3080 Yusuf Amir Edition


----------



## HyperMatrix

sakete said:


> Haha. Well, sometimes there are people with more money than sense who would actually fall for it.
> 
> Many listings of $2k+ though. I got my hands on an XC3 Ultra; maybe I should flip it for $1.5k and get a 3090... nah.


Do it. The only thing worse than Scalpers is the idiots who enable them by paying those prices.


----------



## sakete

HyperMatrix said:


> Do it. The only thing worse than Scalpers is the idiots who enable them by paying those prices.


Nah, I'm not gonna do that as an enthusiast. However, I am tempted to try to get my hands on a 3080 FTW3 once those go on sale. If I do get my hands on one, I might sell it here at cost (well, cost + taxes and shipping I paid for it). That, or return it back to EVGA if I haven't opened it yet (they only accept returns for refund on unopened merchandise).


----------



## Imprezzion

I got offered an MSI Ventus 3080 for 850.
Should I do it and cancel my order for the Gaming X Trio, which I ordered for the same price?

What are your thoughts and input?


----------



## HyperMatrix

Imprezzion said:


> I got offered an MSI Ventus 3080 for 850.
> Should I do it and cancel my order for the Gaming X Trio, which I ordered for the same price?
> 
> What are your thoughts and input?


Is there a reason you're considering this? The Gaming X Trio is a higher-tier card than the Ventus. Have you seen something in the benchmarks that says the Ventus is better?


----------



## KingEngineRevUp

Gigabyte Gaming OC
So far at +110 on the core with temperatures at 69C and 83% fan speed. Can't hear the fans at all.

If the Time Spy stress test passes, moving on to OCing the memory.

Edit: The card is drawing 360W+. I'm very surprised MSI capped their cards at 350W. I'm happy I didn't grab a Trio now.


----------



## mismatchedyes

The FTW3 card has a 380W stock power limit with up to 5% extra.

Seems like the highest so far, but still not really making much use of the 3x 8-pin connectors!


----------



## padman

mismatchedyes said:


> The FTW3 card has a 380W stock power limit with up to 5% extra.
> 
> Seems like the highest so far, but still not really making much use of the 3x 8-pin connectors!


The FTW3 has a 420W max power limit, confirmed on Twitter by an EVGA rep

__ https://twitter.com/i/web/status/1306756154496700416


----------



## HyperMatrix

mismatchedyes said:


> The FTW3 card has a 380W stock power limit with up to 5% extra.
> 
> Seems like the highest so far, but still not really making much use of the 3x 8-pin connectors!


There's a custom power-unlocked BIOS that may or may not get leaked. Currently on the stock BIOS it's hitting 405W draw.




padman said:


> The FTW3 has a 420W max power limit, confirmed on Twitter by an EVGA rep
> 
> __ https://twitter.com/i/web/status/1306756154496700416


I wish they'd state the power limit on the 3090. If they boost it by 20% to 500W for the extra 20% cores, I'd be happy.


----------



## zhrooms

*MSI Livestream* "MSI RTX 3080: The New Performance King" *on September 17, 2020*
_This stream provides a closer look at the MSI GeForce RTX *3080* *GAMING X TRIO* model (and VENTUS). _

*Question:* Why does the 350W power limit exist and will it be removed in the future?

*Pieter:* You can see what it's capable of doing now, but, _ehm_, will it be removed? I, I don't, _eh_, expect so, no, _ehm_, these cards are, are designed this way and as you can see, I mean even with a higher power cap, _eh_, _like_, _eh_, I'm, I'm not sure if it's completely used in the, _eh_, numbers that we saw, from, _eh_, I think the, the overclocked numbers, _eh_, I'm not sure if we can go back to the slide, but, it would be interesting to know but I, I don't think, that, you know, even if we put the, up, _eh_, if that would directly, give you more power and even if it did, _eh_, it would, you would, you would only be talking about a couple of percentages, _ehm_.
*Peter:* Yeah indeed.
*Pieter:* And, and, and the, the number of, the amount of power that would be used to achieve those extra percentages is disproportionate, so what you would get is a card that's going to get a lot hotter, and a lot louder, really fast, while giving you performance you are unlikely to notice, _eh_, performance increase you are unlikely to notice, so really you're looking at kind of like this, the, the, the apex, _eh_, situation here that these cards are already, tuned to, to, the, the optimum situation basically where we can still keep em cool, and they can still give you as much performance as, _eh_, as, _eh_, squeeze out of the card and out of the GPU, as you can get within these parameters, without really blowing the roof off in terms of power usage.
*Peter:* Yeah, it's also bit of a safety feature.
*Pieter:* Yes!
*Peter:* It's, if we do an unlimited slider, there's always somebody who set it to unlimited, and blows up his card, and then he's crying, so yeah.
*Pieter:* Then then, then the GPU dies.
*Peter:* Yeah.
*Pieter:* And that's not a good thing, we are not in the business of harming GPUs here.
*Peter:* No no no, we want the people be happy and play games.
*Pieter:* _Eh_, so they're saying 3 times 8 pin connectors are a waste? _Ehm_, interesting question!
*Peter:* _Eh_, not really, in theory, yes, maybe, _eh_, because, one eight pin power connector can do 150 watts, times two, and _eh_, like 66 watts 12V from the PCI Express slot.
*Pieter:* So you're up to..
*Peter:* 366, so, in theory you can do like, _eh_, two pin, _eh_, two power connectors.
*Pieter:* _Eh_, which our Ventus for example uses.
*Peter:* Yeah but that's not, it's not, _eh_, 340 watt cards, this is a 320 card, a bit more safety margin, but it's really difficult to balance all the power of all the connectors exactly, so you could say this pin can do exactly 150 watts and the other one 150 and the slot 66, it doesn't work that way, so it's more, _eh_, bit of a safety feature to not overload any cables, and not overload your PCI Express slot, I think in some tests I saw PCI Express slot only delivers about like, 50 watts?
*Pieter:* Something like that.
*Peter:* More through the cables, actually that's preferable, I think.
*Pieter:* Well most of the bulk will be drawn through the cables anyway and some very low GPUs, sometimes you will notice it will actually not have any of the six or eight pins, because the, the power draw of the GPU will then be, I don't know, somewhere under 60 for example, 60 watts or 50 watts, and then it can actually draw power from the motherboard from the PCIe slot, to, do the job.

*____*

Absolute insanity going on here. The "_Premium/Performance King_" *Gaming X Trio*, with a custom-designed PCB featuring *16* possible power stages for the GPU and up to a *525W* power limit thanks to the *3x 8-Pin* connectors.. and what do they do? They equip it with *13* power stages for the GPU and a maximum power limit of.. *350W*.

The *Founders Edition*, which is up to *$150 cheaper*, runs *15* power stages (2 more) for the GPU and a maximum power limit of *370W* (20 higher)!

Even better, the *ASUS TUF OC* was more than *$100* less than the Gaming X Trio in Scandinavia today on release day, and runs a full *16* power stages for the GPU and a maximum power limit of *375W* (which is max 2x 8-Pin spec). On top of this incredible PCB, it has two additional features: *Dual BIOS* and an extra *HDMI 2.1* connector; as a bonus it's *23mm shorter*. Nothing short of an amazing card!

So, Gaming X Trio: 13 power stages for the GPU, a 350W max power limit, no Dual BIOS, one of the longest 3080s, and all of it at a sweet *premium price*. How about.. no?

They made the exact same mistake on the 2080 Ti Gaming X Trio: a 330W maximum power limit on 2x 8-Pin + 1x 6-Pin (450W would be within spec). They simply had issues with every single one of their Ti cards: the Lightning had a myriad of issues, the Ventus cooler was awful, and the Duke also had a horribly low power limit (290W), just like the Gaming X Trio, far under FE.

We also learned today that the EVGA FTW3 Ultra comes with a 375W default and 420W maximum power limit (+12% slider), and also features 3x 8-Pin connectors. Even though it's not as much as we wanted (450-475W), it's 70 watts higher (20%) than the Gaming X Trio. The FTW3 Ultra is €899 on Caseking.de right now and the Gaming X Trio is €839: a 7% higher price for a 20% higher power limit, plus an insanely stronger and cooler VRM with up to 19 power stages for the GPU. They're not even comparable.

*Avoid MSI at all costs!*
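If you want to sanity-check the slider figures above: a BIOS max limit is just the default limit times (100 + slider)%. A minimal sketch using the numbers quoted in this thread (the function name is mine; values as reported here, not independently verified):

```python
# BIOS power sliders are a percentage on top of the default power limit.
def bios_max_power(default_w: float, slider_pct: float) -> float:
    """Max power limit from a default limit and an OC slider percentage."""
    return default_w * (1 + slider_pct / 100)

print(round(bios_max_power(375, 12)))  # FTW3 Ultra: 375 W default, +12% -> 420
print(round(bios_max_power(340, 3)))   # Gaming X Trio: 340 W, ~+3% -> ~350
```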


----------



## HyperMatrix

zhrooms said:


> Absolute insanity going on here. The "_Premium/Performance King_" *Gaming X Trio*, with a custom-designed PCB featuring *16* possible power stages for the GPU and up to a *525W* power limit thanks to the *3x 8-Pin* connectors.. and what do they do? They equip it with *13* power stages for the GPU and a maximum power limit of.. *350W*.
> 
> The *Founders Edition*, which is up to *$150 cheaper*, runs *15* power stages (2 more) for the GPU and a maximum power limit of *370W* (20 higher)!


Wow. I corrected someone in the 3090 thread for mentioning it had 15 or 16 power stages only because I assumed it absolutely had to be a mistake. Looking like a real garbage card. Literally 0 reason to buy it.


----------



## cstkl1

@Jpmboy 
any chance of custom bios bro??


----------



## zhrooms

HyperMatrix said:


> Wow. I corrected someone in the 3090 thread for mentioning it had 15 or 16 power stages only because I assumed it absolutely had to be a mistake. Looking like a real garbage card. Literally 0 reason to buy it.


It's a complete joke, compare this garbage to the TUF OC below which is considerably *cheaper*..

TUF OC

Images from TechPowerUp Reviews


----------



## Alemancio

zhrooms said:


> Even better, the *ASUS TUF OC* was more than *$100* less than the Gaming X Trio in Scandinavia today on release day, and runs a full *16* power stages for the GPU and a maximum power limit of *375W* (which is max 2x 8-Pin spec). On top of this incredible PCB, it has two additional features: *Dual BIOS* and an extra *HDMI 2.1* connector; as a bonus it's *23mm shorter*. Nothing short of an amazing card!
> 
> 
> *Avoid MSI at all costs!*


I think you mean the TUF OC has 20 power stages?

Also, quick question: what max wattage (realistically speaking) do you guys think we could unlock on the TUF (with an extra BIOS)?


----------



## pewpewlazer

Alemancio said:


> What max wattage (realistically speaking) do you guys think we could unlock on the TUF? (w/ extra BIOS)


No telling. We don't even know what BIOSes will be available, let alone which will be cross-flashable.
But we already know the TUF can be shunt modded.


----------



## doom26464

So far the gist I get is: avoid MSI, and ASUS TUF and Gigabyte are good.

What about Zotac? Anyone got any reviews or info on their cards?


----------



## Alemancio

doom26464 said:


> So far the gist I get is: avoid MSI, and ASUS TUF and Gigabyte are good.
> 
> What about Zotac? Anyone got any reviews or info on their cards?


Historically speaking (1080 Ti, 2080 Ti), Zotac tends to pile on many power stages of not-really-good quality, resulting in a lot of extra circuitry and no real extra performance. Their approach to this has lost me as a client.


----------



## Imprezzion

The e-tailer I bought my MSI from, who couldn't ship it, now has Inno3D models (iChill X3 and X4) in stock, but at a massively inflated price of around €999-1029.

I sent them an email asking whether they want to send me an X3 or X4 for the same price I paid for the MSI Gaming X Trio.

Let's see what they say about that lol.


----------



## Sebash

Hello everybody,

I managed to preorder these 5 cards in Poland, so the prices are different unfortunately (with shipment to PL).
I will return most of them of course, because I can do that, but I bought them all because I am a perfectionist.
I wanted to buy the Nvidia FE like almost everyone, but couldn't. Especially since I found out that the FE is best in boost clock, avg fps, min fps, and 1% low fps, which is insane. I don't know if I should still try to get an FE or give up and keep one of these 5 cards.

The price in my country for the *Founders Edition is $870* (just to show you the difference compared to the custom cards below I will have).

*So I preordered these:*
1) *RTX 3080 ASUS TUF* for $900 (Poland...)
2) *RTX 3080 ASUS TUF OC* for $928
Both with 3 years warranty, from a good shop (Proshop) - I heard these have a good cooling system, but they are worse than the FE when it comes to boost clock/overclock, unfortunately...

Also I'm not sure which one is better. As I understand it, the OC version is factory overclocked, but the version without OC can be manually overclocked (so I can save my money?) - because I will want to OC any card for sure, so why should I pay for the OC card? *UNLESS the OC means better overclocking potential.*

Also, does the ASUS TUF come with a good bracket to hold the GPU? Or is it built nicely enough that you don't need any bracket, like the one MSI gives you with the card?

3) *RTX 3080 Gigabyte Eagle OC* for $870 - 3+1 years warranty - I don't know anything about this card yet; I bought it only because of the same price as the FE and the 4 years warranty.

4) *2x RTX 3080 MSI VENTUS 3X OC* for $870 (two, because I bought one card early from one shop and the next day found another shop with the same card and a 3 years warranty).
Bought these because MSI adds a free mounting bracket so the card doesn't bend the PCIe slot - which is veryyyy nice. Also I like the design compared to the Gigabyte Eagle.

Of course I do not want to overpay more than $900 for something like a ROG, Xtreme or Trio X. I just want a decent card that will not break, with good OC potential like the FE.


----------



## acoustic

How did you manage to get 5 cards ..


----------



## Sebash

acoustic said:


> How did you manage to get 5 cards ..


I was prepared before 15:00 CET and pretty much just "preordered" them like everyone here in our country.
The one I will get soonest is the ASUS TUF (version without OC), because the guy on the phone told me I was in the first 50 people to order this card; for the OC version I am in the first 100.
(Of course no shop has these cards yet; they need to wait around 2 weeks to get them first, but once they get them they will ship them all out to people chronologically.)
Only 20 cards got shipped to people already. I was unlucky, but I'm still happy I'm in the first 50, at least. I have time now to think, watch benchmarks, etc.

These two ASUS cards I will probably get the soonest.

The MSI and Gigabyte I found on other sites as I mentioned, so those I will pretty much get in 4-5 weeks, maybe even later. But at least I will not need to wait for a restock of FE cards on the Nvidia site.


----------



## acoustic

Seems like people outside of the US and Canada fared much better. I don't really know of any places that are taking "pre-orders" like some of the shops are doing for you


----------



## Sebash

acoustic said:


> Seems like people outside of the US and Canada fared much better. I don't really know of any places that are taking "pre-orders" like some of the shops are doing for you


For me it's normal. I phoned every shop, and they pretty much told me that they have only 20 cards of each brand (ASUS, MSI, Gigabyte); if I managed to be in the first 20, the card is mine and I will get it in a few days, but if not, I can still buy it at the pre-order (lowest) price and will need to wait a bit for a restock, with shipping to people handled chronologically.

You can still try to buy some cards on Polish sites, but of course you need to know that you will be around 2000th in line.


I'm playing at 240Hz/1080p, so I'm happy with my 1080 Ti, but I have wanted to upgrade for a few years now (I skipped the 20 series).


----------



## Imprezzion

TUF OC, 100%. Best PCB, best BIOS power limit, and a higher-binned chip since it's factory overclocked.


----------



## Sebash

Imprezzion said:


> TUF OC, 100%. Best PCB, best BIOS power limit, and a higher-binned chip since it's factory overclocked.


Yeah, but isn't the version without OC the same, except you can manually overclock it without overpaying? That's my point :) and how do I check if that's so?


----------



## sakete

Imprezzion said:


> TUF OC, 100%. Best PCB, best BIOS power limit, and a higher-binned chip since it's factory overclocked.


Factory OC has nothing to do with binning.


----------



## BigMack70

acoustic said:


> Seems like people outside of the US and Canada fared much better. I don't really know of any places that are taking "pre-orders" like some of the shops are doing for you


My impression as well. Seems like scams such as bounce alerts are mostly operating in the US.


----------



## Sebash

sakete said:


> Factory OC has nothing to do with binning.


so pretty much the version without OC can be the same as the OC version if manually overclocked?


----------



## sakete

Sebash said:


> so pretty much the version without OC can be the same as the OC version if manually overclocked?


Yep. Binned chips you'll typically only see on top of the line cards, like the EVGA Kingpin. Otherwise it's just luck of the draw.


----------



## Sebash

sakete said:


> Yep. Binned chips you'll typically only see on top of the line cards, like the EVGA Kingpin. Otherwise it's just luck of the draw.


That would be cool, because that's literally what I expected; it's all marketing.
It's like if I were selling Nvidia FE cards for $699 and an "Nvidia FE OC" (made by me) for $740 lol


----------



## cstkl1

The card is severely temp/power limited.

It boosts to 2145-2160, but you'd be lucky to sustain 19xx in gaming.

VRAM OC is kinda silly: +1250


----------



## Sebash

cstkl1 said:


> The card is severely temp/power limited.
> 
> It boosts to 2145-2160, but you'd be lucky to sustain 19xx in gaming.


You mean both the ASUS TUF and the ASUS TUF OC? Since they are the same when manually OC'd?


----------



## cstkl1

Sebash said:


> You mean both the ASUS TUF and the ASUS TUF OC? Since they are the same when manually OC'd?


don't bother with the OC crap, it's the same as turing

asic boost is what counts.


----------



## Sebash

cstkl1 said:


> don't bother with the OC crap, it's the same as turing
> 
> asic boost is what counts.


what? ;s


----------



## man from atlantis

cstkl1 said:


> don't bother with the OC crap, it's the same as turing
> 
> asic boost is what counts.


Sorry if you already answered this, but which brand model is your card?


----------



## cstkl1

man from atlantis said:


> Sorry if you already answered this, but which brand model is your card?


tuf 3080..

seriously guys, the card is so bottlenecked, either by temp, power limit or cpu at 1440p.. there's no point talking about what card does what..
from my calculation, even if i put this under water i'd need something like a 450-500 watt bios to sustain it

don't care about your tuf.. trinity etc etc...

the card is super bottlenecked.. it can actually boost to 2130-2160..
but cannot sustain it..

power limit



but finally a card that can run this game at 144fps min.. at the game setting.

seeing this.. hmm 3090.. hmmm was planning to get strix 3080/3090...


----------



## man from atlantis

cstkl1 said:


> tuf 3080..
> 
> seriously guys, the card is so bottlenecked, either by temp, power limit or cpu at 1440p.. there's no point talking about what card does what..
> from my calculation, even if i put this under water i'd need something like a 450-500 watt bios to sustain it
> 
> don't care about your tuf.. trinity etc etc...
> 
> the card is super bottlenecked.. it can actually boost to 2130-2160..
> but cannot sustain it..
> 
> power limit
> 
> but finally a card that can run this game at 144fps min.. at the game setting.
> 
> seeing this.. hmm 3090.. hmmm was planning to get strix 3080/3090...


Thanks, is the power limit the same as the OC version, 375W?


----------



## cstkl1

man from atlantis said:


> Thanks, is the power limit the same as the OC version, 375W?


yes..

ok maybe u guys didn't have multiple turing cards.. i had..
the prebin oc doesn't matter.. it's the gpu's binned asic boost..
i had a ref card that boosted way higher than oced cards..
it's the same here.. so forget about oc clocks etc.. it's just down to the asic lucky draw, just like turing..
this card, when it's not being stressed, can boost up to 2160.. vram easy 21.5gbps...

benching time to prove a point in a bit...
my bench windows install can't load msi ab for some reason..


----------



## Nizzen

I got a Palit 3080 Pro OC. It's already been sent 
A few days of benchmarking before the 3090 hopefully ships


----------



## long2905

TUF is the cheapest in my country right now, and I have to wait until Sep 26th. Worth it? Got a free backpack though


----------



## cstkl1

Nizzen said:


> I got a Palit 3080 Pro OC. It's already been sent
> A few days of benchmarking before the 3090 hopefully ships


hope the asic god blesses ya. that cpu/ram of yours needs to do some stretching. lol

bitspower blocks so far are the same for the strix/tuf 3080/3090, so gonna assume the 3080s/3080ti will be too. so just gonna buy one tuf and one strix block. 3-4 weeks eta to release date.


----------



## cstkl1

pricing


----------



## Imprezzion

So, what you guys are saying right now is that no matter what model we get, it's going to power throttle like mad? I mean, at least a 2080 Ti FTW3 Ultra for example had enough power limit to not throttle at all at ~2150MHz 1.093v...

Then how do you even dial in a manual OC when the card boosts to totally different clocks in every load..


----------



## Talon2016

Right now my 3080 XC3 is severely power limited and the power slider does nothing. No response from EVGA; it's capped at about 320w with small peaks slightly above. Updated to the new X1, it did a firmware update, and no fix. Pretty disappointing to see it so severely limited that it can't maintain 2000MHz under load. 4K loads for me are around 1800-1900MHz; it's like having a Max-Q 3080. Luckily I purchased from Microcenter. The card does run great at stock and the performance bump is nice, but it could be nicer if it wasn't so limited.

Edit: GPU-Z reads the vBIOS as 340w with a max adjustable to 366w.


----------



## shallow_

shallow_ said:


> I have ordered an MSI Gaming X Trio from proshop.no, and it was listed in stock before AND after my payment clearing. ETA 4-5 days I think.


Well this went from Yes ?? to Yes!!! to Yes?!?!? to NO!!!! real quick 

Was sure I had snagged one; called the seller an hour later and got phone confirmation my order was in for one of the in-stock cards. Then this morning my order had been changed to 'back ordered', and when I called I got confirmation of the backorder status 

Still have my Asus Strix OC card on preorder from another website, and according to them my order should be among the first to go out around Oct 2. Not celebrating anything until it's in hand though.

What happened to you guys who had B&H preorders, did any ship?


----------



## acoustic

Talon2016 said:


> Right now my 3080 XC3 is severely power limited and the power slider does nothing. No response from EVGA; it's capped at about 320w with small peaks slightly above. Updated to the new X1, it did a firmware update, and no fix. Pretty disappointing to see it so severely limited that it can't maintain 2000MHz under load. 4K loads for me are around 1800-1900MHz; it's like having a Max-Q 3080. Luckily I purchased from Microcenter. The card does run great at stock and the performance bump is nice, but it could be nicer if it wasn't so limited.
> 
> Edit: GPU-Z reads the vBIOS as 340w with a max adjustable to 366w.


Is there a BIOS switch on the XC3?


----------



## Johneey

shallow_ said:


> Well this went from Yes ?? to Yes!!! to Yes?!?!? to NO!!!! real quick
> 
> Was sure I had snagged one; called the seller an hour later and got phone confirmation my order was in for one of the in-stock cards. Then this morning my order had been changed to 'back ordered', and when I called I got confirmation of the backorder status
> 
> Still have my Asus Strix OC card on preorder from another website, and according to them my order should be among the first to go out around Oct 2. Not celebrating anything until it's in hand though.
> 
> What happened to you guys who had B&H preorders, did any ship?


where did u buy the asus rog strix? if u buy on amazon u will get a cancellation. i got one too


----------



## Johneey

acoustic said:


> Is there a BIOS switch on the XC3?


same problem on the ventus 3x oc... u cannot go higher than 320 watts, the slider maxes at 100%... i hope this is a bug.. otherwise it's a big scam again! ...


----------



## shallow_

Johneey said:


> where did u buy the asus rog strix? if u buy on amazon u will get a cancellation. i got one too


I ordered from komplett.no, probably the largest e-tailer in Scandinavia.

Amazon is not competitive on computer hardware for us up north


----------



## Mooncheese

Shunt modding GA102 doesn't look promising.







www.overclock.net


----------



## KingEngineRevUp

BIOS flashing! BIOS flashing! GO! GO! GO!


----------



## Webster200x

Hi, I managed to crack the top 25 on Time Spy Extreme today. As the cards just came out, I presume it will take a while until we get some custom BIOSes for them.


https://www.3dmark.com/spy/13941693


----------



## Johneey

KingEngineRevUp said:


> BIOS flashing! BIOS flashing! GO! GO! GO!


Will it be possible?


----------



## KingEngineRevUp

Johneey said:


> Will it be possible?


Not sure. I imagine it will be hard since we don't even know which cards are reference. Custom to custom card flashing is pretty ****ty from my experience. It was the reference cards that made flashing nice.


----------



## Sebash

So basically both the Asus TUF OC and non-OC can go up to 375W, but the card is still power limited??
The FE is also good at 370W, but the cooling isn't as good as the quiet Asus TUF.
The MSI Ventus 3X OC is limited to only 320W, the Trio at 350W.
The Gigabyte Gaming OC is, from what I heard, one of the best, but too expensive compared to the Asus/FE/Eagle OC.
What about the Gigabyte Eagle OC?

Maybe it's all about drivers? Why would the cards be power limited??? Don't get it.


----------



## cstkl1

my verdict on the card after gaming on it for a few hours
tuf 3080 oced +120, vram +1000
game: vermintide 2
finally maxed out graphics, fps min 144 @ 1440p

the card is still not fully utilized as it's cpu bottlenecked

clocks are severely power limited; thermals 68c max is what i have seen, even in the 3dmark timespy stress test.

gpuz recorded a max wattage of 350, even though i set max pl to 117%, which is 375.

it's boosting easily up to 2160, and if i limit the fps to 144 it will stay at 2130-2160. so it should be stable but power limited.

should ppl upgrade? yes. reviewers are dumb. steve @ gn is not a gamer.

1. tuf is the cheapest card, with a 2080ti strix level pcb and free watch doggy legion.
2. the temps/noise/build quality of the tuf is superbly good. best thing, it's 29.9cm
3. even at severe throttling it's still outpacing the previous gen. so to solve this.. in the beginning i thought it was crazy to wc this card cause you've got to spend usd 250, 1/3rd of the card's value. but now i've changed my mind. a strix is at least usd 200 more and not gonna give u 15% more fps. watercooling i think just might do it. might need a 400 watt bios.

get any tuf u can get. asus never bins gpus, like ever.

btw with the latest AB i can tell ya, most of the time the card is running under 1v due to throttling.
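To put numbers on the power-limit talk: the slider works in percent of the board's base power target, so the "117% which is 375" arithmetic can be sketched as below (assuming the 320W base target reported for the TUF in this thread).

```python
# Sketch of how the power-limit slider maps to watts.
# Assumption: 320 W base power target for the TUF 3080, as
# reported in this thread; the slider is a percentage of that.

BASE_PL_W = 320

def slider_to_watts(percent: int) -> float:
    """Convert a power-limit slider percentage to a wattage cap."""
    return BASE_PL_W * percent / 100

print(slider_to_watts(117))   # 374.4, the "117% which is 375" above
print(slider_to_watts(100))   # 320.0 stock

# The observed ceiling reported above (~350 W) sits below the 374 W
# cap, consistent with per-rail limits binding first: two 8-pin
# inputs plus the PCIe slot each have their own budget.
```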


----------



## KingEngineRevUp

Sebash said:


> The Gigabyte Gaming OC is, from what I heard, one of the best, but too expensive compared to the Asus/FE/Eagle OC.


Gigabyte gaming is 360W power limit. 

But honestly, these cards just aren't going to OC like 20 series. Each generation, boost evolves. We're on boost 5.0 now and it's doing the OC for us. I imagine if we all did this manually on our own, we'd be jumping for joy.


----------



## BigMack70

KingEngineRevUp said:


> Each generation, boost evolves. We're on boost 5.0 now and it's doing the OC for us. I imagine if we all did this manually on our own, we'd be jumping for joy.


That's one way to look at it, I suppose. I'm more pessimistic: this is the third generation where Nvidia has essentially failed to increase clock speeds. We've been locked at the same ~1.9-2.1 GHz frequency since Pascal with no ability to push past it except for exotic cooling. 

I hope this doesn't become a "wall" like 5 GHz has been for CPUs.


----------



## Talon2016

cstkl1 said:


> my verdict on the card after gaming on it for a few hours
> tuf 3080 oced +120, vram +1000
> game: vermintide 2
> finally maxed out graphics, fps min 144 @ 1440p
> 
> the card is still not fully utilized as it's cpu bottlenecked
> 
> clocks are severely power limited; thermals 68c max is what i have seen, even in the 3dmark timespy stress test.
> 
> gpuz recorded a max wattage of 350, even though i set max pl to 117%, which is 375.
> 
> it's boosting easily up to 2160, and if i limit the fps to 144 it will stay at 2130-2160. so it should be stable but power limited.
> 
> should ppl upgrade? yes. reviewers are dumb. steve @ gn is not a gamer.
> 
> 1. tuf is the cheapest card, with a 2080ti strix level pcb and free watch doggy legion.
> 2. the temps/noise/build quality of the tuf is superbly good. best thing, it's 29.9cm
> 3. even at severe throttling it's still outpacing the previous gen. so to solve this.. in the beginning i thought it was crazy to wc this card cause you've got to spend usd 250, 1/3rd of the card's value. but now i've changed my mind. a strix is at least usd 200 more and not gonna give u 15% more fps. watercooling i think just might do it. might need a 400 watt bios.
> 
> get any tuf u can get. asus never bins gpus, like ever.
> 
> btw with the latest AB i can tell ya, most of the time the card is running under 1v due to throttling.


Will you please upload your TUF vBIOS to google drive or something, would like to give it a try.


----------



## sblantipodi

guys, I know this is something OLD, but does it make sense to buy a 4K card with 10GB of VRAM?
will we see a 3080 Ti with 16GB of VRAM by Christmas?


----------



## Imprezzion

sblantipodi said:


> guys, I know this is something OLD, but does it make sense to buy a 4K card with 10GB of VRAM?
> will we see a 3080 Ti with 16GB of VRAM by Christmas?


Not 16GB. 20GB is all but confirmed already; multiple sources list it. Might just be a 3080 20GB, might be a "3080 Super", but it will come.


----------



## sblantipodi

damn, what a stupid market.


----------



## Falkentyne

cstkl1 said:


> yes..
> 
> ok maybe u guys didn't have multiple turing cards.. i had..
> the prebin oc doesn't matter.. it's the gpu's binned asic boost..
> i had a ref card that boosted way higher than oced cards..
> it's the same here.. so forget about oc clocks etc.. it's just down to the asic lucky draw, just like turing..
> this card, when it's not being stressed, can boost up to 2160.. vram easy 21.5gbps...
> 
> benching time to prove a point in a bit...
> my bench windows install can't load msi ab for some reason..


Did you update?









Download: MSI Afterburner 4.6.3 Beta 2 (www.guru3d.com)


----------



## Alemancio

Sebash said:


> So basically both the Asus TUF OC and non-OC can go up to 375W, but the card is still power limited??
> The FE is also good at 370W, but the cooling isn't as good as the quiet Asus TUF.


Exactly, think of it like a car with a motor and tyres. If you have a good motor but bad tyres (temps), your motor won't max out (power). The other way around also applies: if you have great tyres (good temps) but a bad motor (limited power), you won't go fast either. You can personally control temps (by going watercooled, for example), but if your power delivery is bad (say 320W) you won't get as far. 

As I see it, if you manage a 60C load you could push for 400W (EVGA FTW3?) for a stable 2.1GHz under load, if the GPU even reaches it.



cstkl1 said:


> btw with the latest AB i can tell ya, most of the time the card is running under 1v due to throttling.


Why don't you curve the V/F, undervolt it, and free up some watts and temps to push for sustained loads?



KingEngineRevUp said:


> Gigabyte gaming is 360W power limit.
> But honestly, these cards just aren't going to OC like 20 series. Each generation, boost evolves. We're on boost 5.0 now and it's doing the OC for us. I imagine if we all did this manually on our own, we'd be jumping for joy.


Kinda odd Gigabyte went for 360W when the ref max is 375W, no?
Agreed that Boost 5.0 is like AMD's PBO, you won't get MUCH further by manually OC'ing.



BigMack70 said:


> That's one way to look at it, I suppose. I'm more pessimistic: this is the third generation where Nvidia has essentially failed to increase clock speeds. We've been locked at the same ~1.9-2.1 GHz.


Agreed, but understand something: 2GHz might be a real barrier where efficiency drops sharply given the architecture and transistor density. There's a reason AMD hasn't hit 5GHz, for example. So companies would actually rather work on efficiency and improving IPC. Computers aren't measured by GHz, but our minds were taught that way (especially if you grew up during the '80s-2000s MHz craziness).
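The undervolting suggestion has a simple first-order rationale: CMOS dynamic power scales roughly with frequency times voltage squared, so shaving voltage at the same clock frees a disproportionate amount of power budget. A sketch, with illustrative (not measured) voltage points:

```python
# First-order CMOS dynamic power model: P ~ f * V^2 (constants dropped).
# The 1.00 V and 0.85 V points are illustrative only; real Ampere
# V/F curves differ per chip.

def relative_power(freq_ghz: float, volts: float) -> float:
    """Dynamic power relative to an arbitrary baseline, P ~ f * V^2."""
    return freq_ghz * volts ** 2

stock = relative_power(1.9, 1.00)        # near-stock boost point
undervolted = relative_power(1.9, 0.85)  # same clock, lower voltage

saving = 1 - undervolted / stock
print(f"{saving:.1%}")   # roughly 28% less dynamic power at the same clock
```

That ~28% is exactly the headroom a power-limited card needs, which is why a flattened V/F curve often sustains higher clocks than a raw offset.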


----------



## sakete

Should I open it?


----------



## HyperMatrix

sakete said:


> Should I open it?


I dunno man. If I were you I'd sell it for $25k like this guy: EVGA GeForce RTX 3080 XC3 ULTRA - OFFERS ONLY | eBay

But seriously though, you could probably get an extra $400-$500 over your purchase price if you don't need the card or would prefer a higher-end model.


----------



## sakete

HyperMatrix said:


> I dunno man. If I were you I'd sell it for $25k like this guy: EVGA GeForce RTX 3080 XC3 ULTRA - OFFERS ONLY | eBay
> 
> But seriously though, you could probably get an extra $400-$500 over your purchase price if you don't need the card or would prefer a higher-end model.


Well, I already opened it and am planning on sending it to Optimus so they can design a waterblock for it. So that will be at least one confirmed waterblock for the XC3, which is close to reference, but not exactly.

I really want an FTW3, and if I can get my hands on that I'll sell the XC3 for what it cost me (cost + tax + shipping).

If I can't get my hands on the FTW3 within a reasonable time frame (say the next week or two), I'll just keep the XC3 as I wasn't planning on doing any crazy OC'ing anyway.


----------



## Nico67

cstkl1 said:


> gpuz recorded a max wattage of 350, even though i set max pl to 117%, which is 375.


Saw the same thing on [email protected] 's live stream: the power slider made no difference to the power draw limit on the TUF. It stayed stuck at 340w, maybe with a few higher spikes than before, but compared to the FTW3, which clearly went from 380 to 400w with the slider change, the TUF did nothing.


----------



## VPII

Is there any chance we will get modified BIOSes for these 3080 cards, or is that a no-no due to them being locked?


----------



## HyperMatrix

VPII said:


> Is there any chance we will get modified BIOSes for these 3080 cards, or is that a no-no due to them being locked?


Flashing is doable. Custom BIOSes exist. Will they end up in our hands? That's not known yet. But if you're interested in doing that, make sure you get a card that can handle the extra power. This generation is even more sensitive than the last. You probably need 450W for a _stable_ 2.1GHz OC under water. Some of the cheaper boards we've seen have components that are likely to blow up at 450W and above.


----------



## cstkl1

Nico67 said:


> Saw the same thing on [email protected] 's live stream: the power slider made no difference to the power draw limit on the TUF. It stayed stuck at 340w, maybe with a few higher spikes than before, but compared to the FTW3, which clearly went from 380 to 400w with the slider change, the TUF did nothing.


it works in ab.. steve being stupid using EVGA...
AB 4.6.2 also works.. just the voltage control was greyed out, which does nothing on the asus tuf anyway.. 
but the beta 4.6.3 B2 has that option..


----------



## cstkl1

Alemancio said:


> Exactly, think of it like a car with a motor and tyres. If you have a good motor but bad tyres (temps), your motor won't max out (power). The other way around also applies: if you have great tyres (good temps) but a bad motor (limited power), you won't go fast either. You can personally control temps (by going watercooled, for example), but if your power delivery is bad (say 320W) you won't get as far.
> 
> As I see it, if you manage a 60C load you could push for 400W (EVGA FTW3?) for a stable 2.1GHz under load, if the GPU even reaches it.
> 
> 
> Why don't you curve the V/F, undervolt it, and free up some watts and temps to push for sustained loads?
> 
> 
> Kinda odd Gigabyte went for 360W when the ref max is 375W, no?
> Agreed that Boost 5.0 is like AMD's PBO, you won't get MUCH further by manually OC'ing.
> 
> 
> Agreed, but understand something: 2GHz might be a real barrier where efficiency drops sharply given the architecture and transistor density. There's a reason AMD hasn't hit 5GHz, for example. So companies would actually rather work on efficiency and improving IPC. Computers aren't measured by GHz, but our minds were taught that way (especially if you grew up during the '80s-2000s MHz craziness).


v/f oc is worth it when you are not hitting the power limit... cause then it's only temp that's changing the curve..


----------



## cstkl1

this is seriously how the game was meant to be seen.. 1440p maxed out.. 144hz smooth..
first gpu where i can max out settings @ 1440p, and also the first where i had to bump the cpu to 5.3..
didn't bother to oc the gpu and left it on the 2nd bios quiet mode.


----------



## Vapochilled

add me to the owners list.
I got the cheap Gigabyte 3080 Eagle OC.
I think it maxes out at 340W. Need to check. Still in the box. 
I had my tonsils removed a few days ago.. bahh. Feeling better today and planning to mount it this morning.

I believe this generation is almost maxed out from stock.
I'll be happy to have the AB voltage curve working and maybe... why not... undervolt a little to get more stable clocks.

People tend to look a lot at highest clocks and higher avg fps, but you should read the articles from GamersNexus where it's clear that these situations end up creating latency.
The fact that the GPU needs to constantly readjust clock and voltage will create that latency.


----------



## Imprezzion

I kinda wanted an Eagle as it looked great initially. Share your clock and power limit findings please. If they are good I'll get one as soon as they're in stock, even tho I'd prefer an EVGA or ASUS TUF this time.


----------



## Sebash

cstkl1 said:


> this is seriously how the game was meant to be seen.. 1440p maxed out.. 144hz smooth..
> first gpu where i can max out settings @ 1440p, and also the first where i had to bump the cpu to 5.3..
> didn't bother to oc the gpu and left it on the 2nd bios quiet mode.


Can you test how many fps you can get at 1080p?
I need to know if I really need to switch my 240hz e-sports monitor to something like 1440p 165hz :| I would prefer not to though... i play cod mw (have a 1080ti now)
8700k 4.8GHz.


----------



## Vapochilled

Imprezzion said:


> I kinda wanted a Eagle as it looked great initially. Share your clock and power limits findings please . If they are good I'll get one as soon as they go in stock even tho I prefer a EVGA or ASUS TUF this time.



Testing the new MSI AB 4.6.3 Beta 2...
The power slider is locked at 100%.... 
And searching for the AORUS Engine to tweak... there is nothing under downloads... 

Ideas??


----------



## Nico67

cstkl1 said:


> it works in ab.. steve being stupid using EVGA...
> AB 4.6.2 also works.. just the voltage control was greyed out, which does nothing on the asus tuf anyway..
> but the beta 4.6.3 B2 has that option..


May not be great, but der8auer said the Asus tweak to 110 did nothing also, and you could see it stuck at 340w in GPU-Z again. Could just be a bug, but I am always skeptical when nothing changes. Could be it's limiting on a single rail over-limit or something weird like that too, so 110 total doesn't matter. Hope I am wrong. Just wish things clearly did what they say they're doing.


----------



## cstkl1

Nico67 said:


> May not be great, but der8auer said the Asus tweak to 110 did nothing also, and you could see it stuck at 340w in GPU-Z again. Could just be a bug, but I am always skeptical when nothing changes. Could be it's limiting on a single rail over-limit or something weird like that too, so 110 total doesn't matter. Hope I am wrong. Just wish things clearly did what they say they're doing.


dat another village idiot with stevo, plus the rambling fool completes a trio.... ignore them. 

typical gameplay. power limit, cpu bottlenecked.. but still pushing frames never seen before, while nvenc recording this at 4:4:4..
this with quiet mode, with cpu rad fans on low. was trying to listen for coil whine.. none..

dats 110.. max is 117 but no point
da bios is limited to both pcie 8-pins @ 150 watt each and the pcie slot at just under 60. gpuz sensors show da max. it spikes to 118

so the card max is 355-360 watt. 

a noob with a windowed game + ab + gpuz sensor panel can see it clearly.


----------



## DA_Maverick_AD

Hi veterans... new to this forum, and I just got an MSI Gaming X Trio 3080 pre-order (ETA 5 Oct). I'm particularly curious about its OC capability given I just paid US$100 above MSRP for this card (hoping for better-than-FE performance; the 3x 8-pin setup seems useless though?). 

I'm a bit disappointed looking at the Hardware Unboxed review of the model, but saw Guru3D and others giving the Trio high marks. I'm confused whether I should cancel the Trio pre-order and try to get a cheaper AIB. Note we don't have the FE selling in Australia, only AIBs. 

Looking forward to your views.


----------



## Sonac

Imprezzion said:


> TUF OC 100%. Best PCB, best BIOS power limit and higher binned chip as it's factory overclocked.


Best cheap PCB, yes. The FTW3/KINGPIN PCBs still absolutely slaughter it. But those are all at least $100 more, so it makes sense. Out of all the cheap and even midrange cards, the TUF OC is the best in my opinion.


----------



## Sebash

Sonac said:


> Best cheap PCB, yes. The FTW3/KINGPIN PCBs still absolutely slaughter it. But those are all at least $100 more, so it makes sense. Out of all the cheap and even midrange cards, the TUF OC is the best in my opinion.


Did you compare them both? 
Because the TUF non-OC is the same, I heard... it just needs to be manually overclocked


----------



## Nammi

DA_Maverick_AD said:


> Hi veterans....new to this forum, and just got myself an MSI Gaming X Trio 3080 pre-order (ETA 5-Oct). Am particularly curious about its OC capability given I just paid US$100 above MSRP for this card (hoping for better than FE performance; 3 8-pin seems useless though?).
> 
> Am a bit disappointed looking at Hardware unboxed review of the model, but saw Guru3D and others were giving the Trio high marks. Am confused if I should cancel the Trio pre-order and try to get a cheaper AIB. Note we don't have FE selling in Australia so only AIBs.
> 
> Looking forward to your views.


350W PL and a lackluster PCB, plus the chunky price tag. It's one of the cards that should be on everyone's avoid list, especially if you're looking to OC...


----------



## Imprezzion

Ignore my comment about the TUF OC being pre-binned. My mind got stuck on the whole A chip vs non-A chip of the 2080 Ti.

I made up my mind. Unless I can get one for like €720 or less, I will just wait for either an FE to come up for sale on nVidia's website again or an EVGA FTW3 version (Ultra or the Hybrid) to become available somehow, either direct from EVGA or from a retailer.


----------



## sjd

DA_Maverick_AD said:


> Hi veterans....new to this forum, and just got myself an MSI Gaming X Trio 3080 pre-order (ETA 5-Oct). Am particularly curious about its OC capability given I just paid US$100 above MSRP for this card (hoping for better than FE performance; 3 8-pin seems useless though?).
> 
> Am a bit disappointed looking at Hardware unboxed review of the model, but saw Guru3D and others were giving the Trio high marks. Am confused if I should cancel the Trio pre-order and try to get a cheaper AIB. Note we don't have FE selling in Australia so only AIBs.
> 
> Looking forward to your views.


So far it's looking pretty disappointing, but I'd wait and see since 5 October is still two weeks out. The TUF is definitely the most bang-for-buck option right now with the launch discount price, but it's possible you won't get your hands on one for a month or two if you order right now. I've got both the TUF and the Gaming X Trio on order, guessing the latter will arrive next week or so. I'll wait and see if someone reputable gets their hands on it to look at it more closely, because even if the PCB and PL are worse than the TUF's, there might not be that much of a difference for the more casual OCer.

The sustained power consumption running Furmark, according to TechPowerUp, was 425 W for the Gaming X Trio, while the TUF OC running the same test drew 405 W. The FE was 370 W. Perhaps the PL listed in the BIOS for these cards isn't accurate? I must say I'm not the right guy to figure out what these numbers mean; maybe it just shows higher inefficiency for the AIBs.
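One hedged way to read those Furmark numbers is as a percentage of each card's rated power limit. The measured figures are from the post above; the rated limits (350W Trio, 375W TUF OC, 370W FE) are the figures discussed in this thread, so treat the pairings as my assumption.

```python
# Measured Furmark draw (per the post above) vs. rated power limit
# (per figures discussed in this thread; treat pairings as assumptions).

cards = {
    "MSI Gaming X Trio": (425, 350),
    "ASUS TUF OC":       (405, 375),
    "NVIDIA FE":         (370, 370),
}

for name, (measured_w, rated_w) in cards.items():
    over = measured_w / rated_w - 1
    print(f"{name}: {measured_w} W measured vs {rated_w} W rated ({over:+.0%})")

# The Trio overshooting its rated limit by ~21% is what makes the
# BIOS-reported PL numbers look questionable, as the post says;
# transient spikes and VRM losses are outside this simple ratio.
```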


----------



## Vapochilled

Losing my mind ....

F1 2019 doesn't work with the 3080... crashes!! If I put the 1080 Ti back in, it works fine. Been waiting for a support agent for 45 min, when it said 5 min... Good AI here on the prediction...
Benchmarks are stable and fine. 

Also, I have nothing but the Gigabyte card.... however, it was mentioned that I would get a GeForce Now 1-year subscription + a game... no instructions to redeem it or anything..

These bastards did a pure paper launch. It doesn't seem to be ready... I wonder how many more games will crash..


----------



## cstkl1

btw this is not my highest stable, but i chose this as i think it's the norm..


----------



## padman

Vapochilled said:


> F1 2019 doesn't work with the 3080... crashes!!


Gamers Nexus had the same problem with F1 2019. It's not your card's fault. What would you call support for?


----------



## gerardfraser

It seems a lot of games do not work with the RTX 3080. Shame, real shame. Well, at least the RTX 3080 has HDMI 2.1, that's a good thing. And no overclocking on the cards is a cluster FK. Soon there'll be no overclocking on any CPU/GPU; they're pretty much maxed out when released.


----------



## skline00

cstkl1: Thank you for your posts. Good to see a poster who actually has one.
How is your 10900k cooled? Is it OC'd?

What gpu did your RTX 3080 replace?


----------



## cstkl1

skline00 said:


> cstkl1: Thank you for your posts. Good to see a poster who actually has one.
> How is your 10900k cooled? Is it OC'd?
> 
> What gpu did your RTX 3080 replace?


WC'd and OC'd.. daily it's 51, but i can run 53, hmm, not too keen on it.. the other day was just to see whether it would help with the cpu bottleneck.. it didn't..

an RTX 2080 Ti MSI Trio, watercooled, bios modded 
and the other one is an Asus Strix 2080 Ti.. and now this.. hmm, looks like the 3090 is not worth it..


----------



## TwinParadox

There seem to be some issues dumping the BIOS on RTX 3xxx cards. With NVFlash or GPU-Z it doesn't even detect the VGA adapter.


----------



## t1337dude

Nammi said:


> 350W PL and a lackluster PCB, plus the chunky price tag. It's one of the cards that should be on everyone's avoid list, especially if you're looking to OC...


Well, let me flip the script here. Nobody should be buying these cards thinking they will get great gains from OC. It will be a fruitless endeavor for most; these cards are already pushed close to their limits out of the box. The MSI card runs COOLER and significantly QUIETER than the other AIBs, and thus can better sustain boost clocks. Compare those *significant* attributes to squeezing a few extra MHz out of your card and maybe you'll re-think things.


----------



## sakete

t1337dude said:


> Well, let me flip the script here. Nobody should be buying these cards thinking they will get great gains from OC. It will be a fruitless endeavor for most; these cards are already pushed close to their limits out of the box. The MSI card runs COOLER and significantly QUIETER than the other AIBs, and thus can better sustain boost clocks. Compare those *significant* attributes to squeezing a few extra MHz out of your card and maybe you'll re-think things.


So essentially, slap on a waterblock, and you're set. Wonder if it's even worth it to go the custom PCB route such as FTW3 or Strix. Currently have the EVGA 3080 XC3, but really wanted the FTW3. But not sure if it's worth the hassle at this point. I will be slapping on a waterblock either way as the rest of my system is a custom loop.


----------



## skline00

Thank you for the info cstkl1


----------



## cstkl1

sakete said:


> So essentially, slap on a waterblock, and you're set. Wonder if it's even worth it to go the custom PCB route such as FTW3 or Strix. Currently have the EVGA 3080 XC3, but really wanted the FTW3. But not sure if it's worth the hassle at this point. I will be slapping on a waterblock either way as the rest of my system is a custom loop.


that's my dilemma..
a good waterblock with backplate is at least 250 usd with shipping.. that's like 1/3rd of the card..
adding that on the card price.. we might be close to a 3080 super or a 3080 ti variant.. since there's a gap of usd 700 between the 3080 and 3090...

cause it's kinda dumb if suddenly there's something like a 450 watt unlocked gigabyte 3080 super aorus extreme.. a wb edition at only usd 900..... strix 3080 super poseidon.. etc...

but no way atm will i get a strix 3080 or equivalent and a waterblock for it.. cause the next tier 3080ti is looming around the corner.


----------



## t1337dude

sakete said:


> So essentially, slap on a waterblock, and you're set. Wonder if it's even worth it to go the custom PCB route such as FTW3 or Strix. Currently have the EVGA 3080 XC3, but really wanted the FTW3. But not sure if it's worth the hassle at this point. I will be slapping on a waterblock either way as the rest of my system is a custom loop.


"slap on a waterblock"

That's fine if you want to spend a bunch of extra money and put in a ton of extra effort, for not-so-great gains. It's the enthusiast thing to do.

Anyways, I misspoke when I said the Gaming X Trio runs cooler, because it actually runs a tad on the warm side. Having said that, let's look at the end result...

Same 4K performance in Tomb Raider, yet the MSI Gaming X is doing it 3 dBA quieter (which is fairly significant). The TUF gets to around those noise levels with the down-clocked quiet BIOS, but then you wouldn't see it match the MSI Gaming Trio like it does above.

If squeezing out every measly MHz for those 1-2 FPS gains is super important to you, then you're probably going to put a water block on it anyway. If you're not going to those lengths, then the MSI Gaming Trio is certainly competitive, or at least more competitive than what people seem to be saying about it.
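For context on why 3 dBA is "fairly significant": decibels are logarithmic, so a 3 dB drop corresponds to roughly half the sound power (perceived loudness scales differently; a halving of perceived loudness is closer to 10 dB). A quick sketch of the math, assuming the reported dBA figures are treated as sound power levels:

```python
# Convert a decibel difference into a sound-power ratio.
def power_ratio(db_difference: float) -> float:
    """Ratio of sound power corresponding to a dB difference."""
    return 10 ** (db_difference / 10)

# A card that measures 3 dB quieter emits roughly half the acoustic power.
print(f"3 dB ratio: {power_ratio(3):.2f}x")
```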


----------



## Talon2016

TwinParadox said:


> There seems to be some issues dumping bios on RTX 3xxx cards. With NVFlash or GPU-Z it even doesn't detect the VGA adapter.


Having the same issue with GPU-Z. Can't read the vBIOS. Cool.


----------



## TwinParadox

Talon2016 said:


> Having the same issue with GPU-Z. Can't read the vBIOS. Cool.


GPU-Z embeds a specific NVFlash version, so an updated standalone NVFlash is required for it to detect the adapter and properly dump the BIOS ROM.


----------



## sakete

t1337dude said:


> "slap on a waterblock"
> 
> That's fine if you want to spend a bunch of extra money and put in a ton of extra effort, for not-so-great gains. It's the enthusiast thing to do.
> 
> Anyways, I misspoke when I said the Gaming X Trio runs cooler, because it actually runs a tad on the warm side. Having said that, let's look at the end result...
> 
> Same 4k performance in Tomb Raider, yet the MSI Gaming X is doing it at 3dBA quieter (which is fairly significant). The TUF Mode gets around those noise levels with the down-clocked quiet BIOS, but you wouldn't be seeing it match the MSI Gaming Trio like it is above.
> 
> If getting every measly MHz for that 1-2 FPS gains is super important to you, then you're probably going to put a water block on it anyways. If you're not going to those lengths, then the MSI Gaming Trio is certainly competitive, or at least more competitive than what people seem to be saying about it.


I'm getting my block for free


----------



## Vapochilled

gerardfraser said:


> It seems a lot of games do not work with the RTX 3080. Shame, real shame. Well, at least the RTX 3080 has HDMI 2.1, that's a good thing, and no overclocking on the cards is a cluster FK. Soon there'll be no overclocking on any CPU/GPU; they're pretty much maxed out when released.


Yup! Horribly rushed release. They wanted to get ahead of AMD and launch the 3080 with the 1,000 cards they had worldwide...
Then they faked going out of stock in 15 seconds, and now games don't run unless it's the games reviewers and NVIDIA benched.

Feeling stupid here since I can't play the 2 games I want.
I already created a ticket with them.


----------



## Rbk_3

Was able to get the Gigabyte Gaming OC. A couple of things:
I got 4K 120 to work on my LG C9, but I can't get G-Sync to work with it; the screen goes black when I launch a game.

Also getting some frametime issues in Warzone that I wasn't getting with my 1080 Ti. Nothing major, but some little spikes from 7ms to 13ms, which is enough to annoy me. Regular MP seems fine. I did a fresh Windows install, updated my BIOS, even got some new RAM, removed my OCs, etc., but it is still an issue.

Also, I can't move the power slider past 100% in Afterburner. On my old card I could go to 117%.


----------



## sakete

I'm assuming all these issues will be resolved with driver updates.


----------



## Nammi

t1337dude said:


> "slap on a waterblock"
> 
> That's fine if you want to spend a bunch of extra money and put in a ton of extra effort, for not-so-great gains. It's the enthusiast thing to do.
> 
> Anyways I mispoke when I said the Gaming X Trio runs cooler, because it actually runs a tad on the warm side. Having said that, let's look at the end result...
> 
> Same 4k performance in Tomb Raider, yet the MSI Gaming X is doing it at 3dBA quieter (which is fairly significant). The TUF Mode gets around those noise levels with the down-clocked quiet BIOS, but you wouldn't be seeing it match the MSI Gaming Trio like it is above.
> 
> If getting every measly MHz for that 1-2 FPS gains is super important to you, then you're probably going to put a water block on it anyways. If you're not going to those lengths, then the MSI Gaming Trio is certainly competitive, or at least more competitive than what people seem to be saying about it.


The reason I wrote off the Gaming X is the price; it seems it was priced too high by quite a few retailers here and has now been adjusted. Even after that it is still priced quite high, as cards with dual BIOS, a higher power limit (if it's actually capped at 350W), better PCBs, and about the same cooling can be had for about the same money or less.


----------



## Alemancio

To all who are crying hard because:

1. It won't overclock that much
2. It won't run certain games
3. You can't use the power sliders

You do know that you bought a rushed-out product without that much testing, right? Buying rev00 usually entails this too (other than bragging rights). Don't forget that while NVIDIA might have fudged this launch up by pre-emptively releasing a beta product, it's a reputable company that will do you right, but it will take some time.

So please, enjoy your 3080 and stop crying; or, if you can't, sell it to me for MSRP and be done with it.


----------



## Nico67

cstkl1 said:


> btw this not my highest stable but chosed this as i think this the norm..
> View attachment 2459374


Wow, that is weird. AB and the BIOS are suggesting the same 117% based on a 320W default; however, the specs suggest it should be a 340W default with 110%.
It almost looks like it's getting 110% of 320W, which would be about 352W at the full slider, but if it's pinging off a solid power cap and not showing 375W and higher, then the overclocking apps don't seem to be reading something correctly.
I would really like to see a BIOS that allowed the slider to get up around 450W, to see what it takes to get off the power cap.
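As a sanity check on those slider numbers (a quick sketch; the 320W default and the percentages are taken from the posts above):

```python
# What board power does each power-limit slider setting imply?
def slider_watts(default_tdp_w: float, slider_percent: float) -> float:
    return default_tdp_w * slider_percent / 100

print(slider_watts(320, 117))  # 374.4 W if 117% is applied to a 320 W default
print(slider_watts(320, 110))  # 352.0 W, close to the observed cap
```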


----------



## cstkl1

Alemancio said:


> To all that are crying hard because:
> 
> 1. It wont overclock that much
> 2. It wont run certain games
> 3. You cant use the powersliders
> 
> You do know that you bought a rushed out product with not that much testing - right? Buying rev00 usually entails this too (other than bragging rights). Dont forget that while NVIDIA might have fudged this launch up by pre-emptively launching a beta product, it's a reputable company that will do you right - but it will take some time.
> 
> So please, enjoy your 3080, stop crying because if you do, sell it to me for MSRP and be done with it.


1. You are totally correct; even throttled, the 3080 is a damn silly GPU.
2. But you can't blame others for trying to sort out the issues now.


----------



## cstkl1

Nico67 said:


> Wow, that is weird. AB and the BIOS are suggesting the same 117% based on a 320W default; however, the specs suggest it should be a 340W default with 110%.
> It almost looks like it's getting 110% of 320W, which would be about 352W at the full slider, but if it's pinging off a solid power cap and not showing 375W and higher, then the overclocking apps don't seem to be reading something correctly.
> I would really like to see a BIOS that allowed the slider to get up around 450W, to see what it takes to get off the power cap.


ASUS GPU Tweak has been updated; same result. I just didn't bother with it because it clashes with the MSI OSD.

The voltage slider only seems to work at stock; I saw the voltage going up, but once the power-limit throttle hits, no difference.

So, to watercool or not... hmm, three weeks until Bitspower's release date.

Why Bitspower? They don't have the poor-quality nickel plating like EK, it comes with the backplate, and the Bitspower plexi RGB is diffused more nicely.


The GeForce etching glows nicely.

The new blocks are better than these.

Two years running and it still looks like new.


----------



## pewpewlazer

t1337dude said:


> "slap on a waterblock"
> 
> That's fine if you want to spend a bunch of extra money and put in a ton of extra effort, for not-so-great gains. It's the enthusiast thing to do.
> 
> Anyways I mispoke when I said the Gaming X Trio runs cooler, because it actually runs a tad on the warm side. Having said that, let's look at the end result...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Same 4k performance in Tomb Raider, yet the MSI Gaming X is doing it at 3dBA quieter (which is fairly significant). The TUF Mode gets around those noise levels with the down-clocked quiet BIOS, but you wouldn't be seeing it match the MSI Gaming Trio like it is above.
> 
> If getting every measly MHz for that 1-2 FPS gains is super important to you, then you're probably going to put a water block on it anyways. If you're not going to those lengths, then the MSI Gaming Trio is certainly competitive, or at least more competitive than what people seem to be saying about it.


Let's be real here. If you care about performance or noise, you've been putting water blocks on your graphics cards for years. 
250w TDP + air = LOL. 
320-350w TDP + air = ****.


----------



## cstkl1

OMG, Zotac.

https://www.reddit.com/r/ZOTAC/comments/iv4j17/_/g5q82zr

"It's by design," he says: less than the FE. The PR guy should at least check; obviously he doesn't know of the TUF's existence.


----------



## Alemancio

cstkl1 said:


> omg zotac
> 
> __
> https://www.reddit.com/r/ZOTAC/comments/iv4j17/_/g5q82zr
> 
> its be design he says less than FE.
> pr dude should atleast check.. obviously he doesnt know of tuf existence.


I mentioned it some posts before: stay away from Zotac. Their cards are usually inferior. "But they use so many power stages!" Yeah, but INFERIOR. And now (3080s) they don't even use that many, with a poor design, and on top of that their basic card is worse than the FE? hahahahahahahaha...


----------



## cstkl1

Alemancio said:


> I mentioned some posts before. Stay away from Zotac. Their cards are usually inferior. "But they use so many powerstages!" yeah, but INFERIOR. And now (3080s) they dont even use that many, with a poor design and on top their basic card is worse than FE? hahahahahahahaha...


It's by design, bro, by design.
The architect expected your response, by design.
The design of da milkin' machine.


----------



## Nico67

cstkl1 said:


> ASUS GPU Tweak has been updated; same result. I just didn't bother with it because it clashes with the MSI OSD.
> 
> The voltage slider only seems to work at stock; I saw the voltage going up, but once the power-limit throttle hits, no difference.
> 
> So, to watercool or not... hmm, three weeks until Bitspower's release date.
> 
> Why Bitspower? They don't have the poor-quality nickel plating like EK, it comes with the backplate, and the Bitspower plexi RGB is diffused more nicely.
> 
> 
> View attachment 2459434
> 
> 
> the geforce etching glows nicely.
> 
> the new block are better than these.
> 
> View attachment 2459435
> 
> 
> two years running. still looks like new.


Watercooling will definitely make the clocks more stable, and it will give you a little more power headroom, as it cools the VRM down a bit (making it more efficient) and there are no fans drawing power.
It might clock a bit higher, but without more power that's not going to help hugely.
Bitspower blocks are nicer than EK's, that's for sure, just not as easy to obtain. EK blocks always look a little rougher machined, but who knows, that might actually cool a little better; it sure does pick up any gunk in the system, though.


----------



## cstkl1

Nico67 said:


> Watercooling will definitely make the clocks more stable and will give you a little more power as it cools the vrm down a bit to make it more efficient and no fans
> It might clock a bit higher, but without the power that's not going to help hugely.
> Bitspower blocks are nicer than EK that''s for sure, just not as easy to obtain. EK always look a little rougher machined, but who knows that might actually cool a little better, but sure does pickup any gunk in the system


EK nickel plating corrodes, period. They have been trying to solve this from day one; the difference is that now they have their own coolant, a camouflage that lets them blame their issues on user error.

Bitspower ships internationally with FedEx/DHL, so what do you mean by difficult to obtain? In 2-3 days it's in your hands. Their payment gateway goes directly through their bank, which uses Visa/Mastercard, hence a simple OTP and the best exchange rate.

With Aqua etc., my issue in the past was PayPal. Maybe that's changed now.


----------



## Falkentyne

cstkl1 said:


> dat another village idiot with stevo plus the rambling fool completes a trio.... ignore them.
> 
> 
> 
> 
> 
> 
> typical gameplay. powerlimit, cpu bottlenecked.. but still pushing frames nvr seen before while nvenc recording this at 4:4:4 ..
> this with quiet mode.. with cpu rad fans on low. was trying to listen for coil whines.. none..
> 
> dats 110.. max is 117 but no point
> da bios is limited to both pcie 8 [email protected] watt and pcie slot at just under 60. gpuz sensor shows da max. it spikes to 118
> 
> so the card max is 355-360 watt.
> 
> a noob with heave window mode+ab+ gpuz sensor mode can see it clearly.


Video looks fun. I should have bought this during the Steam sale, but I didn't.
What load vcore for 5.3 GHz in this game? The CPU looks like it's getting nice and warm.


----------



## cstkl1

Falkentyne said:


> Video looks fun. Should have bought this during the steam sale but I didn't
> What load vcore for 5.3 ghz in this game? CPU looks like it's getting nice and warm..


The game needs many hours to get decent at. There's the grind: learning mechs, maps, etc. 1k hours spent = you'll only be decent; 2k is the mark where people start soloing it.

The fans on the rads were running at 500 RPM. I was trying to hear any coil whine, and even turned off the AC, which was audible, so there was just the ceiling fan on sleep mode, which you can't hear much. The load vmin for 5.3 is 1.34V. I just ran it to check whether it would help with the 3080 CPU bottleneck issue. Nope.
At 1440p I'm guessing this is gonna be problematic. Also, with the GPU throttling, I can't really pinpoint where the issue is.

Now the big dilemma: the Strix is around $250 USD more, and watercooling a TUF will cost me $250 USD. If this was a $1200 USD card, or the price I paid for my Trio two years ago ($1600 USD), it'd be a no-brainer. There's a $700 USD gap between the 3080/3090, so it'd be kind of silly if I got a Strix 3080 and watercooled it, and then a 3080 Ti showed up at $1k USD.

Pretty sure I'm not the only one facing this. I don't need the Super variant (aka 20GB), so the 3080 Ti... I can confirm with you it's real. You know that leaked Strix slideshow? That card is real.

It's the reason the TUF was priced the same as the FE. The reference-card Turbo model is cheaper (not launched yet).

Igor said the GPU-Z sensor values are not true for the FE. Waiting for his TUF breakdown.


----------



## meicelinho

Hi folks,
Is this the right place to ask about compatible water blocks for the EVGA RTX 3080 XC3 Black Gaming? For now, it looks like no block is available. I would like to get notified if one becomes available to buy, or at least to preorder.


----------



## ThrashZone

meicelinho said:


> Hi folks,
> Is this the right place to ask about compatible water blocks for the EVGA RTX 3080 XC3 Black Gaming? For now, it looks like no block is available. I would like to get notified if one becomes available to buy, or at least to preorder.


Hi,
EK or Watercool; I've read some Bitspower blocks are coming, but I'm sure they will be stupidly priced like all Bitspower items.
With EVGA it might be best to get a Hydro Copper from them.


----------



## zhrooms

meicelinho said:


> Is here the right place to ask for compatible water blocks for the EVGA RTX 3080 XC3 Black Gaming? For now, it looks like, that no block is available


Too early to say, we still lack pictures of most cards (PCB), but so far it looks like most designs are based on NVIDIA 2x8-Pin PCB but slightly altered, some of them will be compatible with EK standard 3080/3090 block, and others not. Until we have pictures of them all, or direct confirmation by EK (or other block manufacturers), there's not much to say, except.. wait.


----------



## Brimlock

I'm wondering if my 7700k will be a bottleneck to the 3080 right now. Like this poor chip has held its ground for so long and can probably do for even longer, but has it reached a point that it could start to hinder my performance?


----------



## arrow0309

ThrashZone said:


> Hi,
> EK or Watercool; I've read some Bitspower blocks are coming, but I'm sure they will be stupidly priced like all Bitspower items.
> With EVGA it might be best to get a Hydro Copper from them.


You sure?
It's still a custom design, different from the FTW3 but still a custom one:


----------



## rluker5

Brimlock said:


> I'm wondering if my 7700k will be a bottleneck to the 3080 right now. Like this poor chip has held its ground for so long and can probably do for even longer, but has it reached a point that it could start to hinder my performance?


Not if you bottleneck your system even more with something else like a 60fps limit or graphics settings turned up to the point that you get a GPU bottleneck first.
Every CPU can be a bottleneck for this card, depending on the resolution, the graphics settings, and how high the framerate cap is.

I'm going to get the highest settings I can at 4k with a 60 fps limit, so I would be fine with a 7700k, but I don't know what you plan on doing with your 7700k. 
Regardless, here are some convenient links pertinent to your quandary: How Much CPU Does the GeForce RTX 3080 Need? NVIDIA GeForce RTX 2080 Ti PCI-Express Scaling
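The point above can be reduced to a crude model: the slowest stage in the pipeline limits delivered frame rate. Purely illustrative; the frame rates below are made-up placeholders, not benchmarks of a 7700K:

```python
# The slowest stage (CPU, GPU, or an fps cap) limits the delivered frame rate.
def delivered_fps(cpu_fps, gpu_fps, fps_cap=None):
    fps = min(cpu_fps, gpu_fps)
    return min(fps, fps_cap) if fps_cap is not None else fps

# Hypothetical numbers: a CPU that can feed 90 fps, a GPU capable of 140 fps at these settings.
print(delivered_fps(90, 140))      # 90 -> CPU-bound
print(delivered_fps(90, 140, 60))  # 60 -> cap-bound; the CPU never becomes the limit
```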


----------



## Brimlock

rluker5 said:


> Not if you bottleneck your system even more with something else like a 60fps limit or graphics settings turned up to the point that you get a GPU bottleneck first.
> Every cpu can be a bottleneck to this card depending on the resolution, graphics settings and how high the framerate cap is.
> 
> I'm going to get the highest settings I can at 4k with a 60 fps limit, so I would be fine with a 7700k, but I don't know what you plan on doing with your 7700k.
> Regardless, here are some convenient links pertinent to your quandary: How Much CPU Does the GeForce RTX 3080 Need? NVIDIA GeForce RTX 2080 Ti PCI-Express Scaling


Oooh neat, thank you.


----------



## Vapochilled

Until we can raise the TDP, I did undervolt...
Currently 1915 MHz @ 0.920V, to avoid reaching 340W on my Gigabyte Eagle OC.
When the game doesn't push much, it boosts to 2005 MHz at 1.02V and more.
So the chip is good; the 8nm process is able to work at low voltage, but the TDP is killing us :\
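A rough estimate of what that undervolt saves, using the common dynamic-power approximation P ∝ f·V². A sketch only: it ignores static leakage and uses the clocks and voltages from the post above.

```python
# Dynamic power scales roughly with frequency times voltage squared.
def relative_power(freq_mhz, volts, ref_freq_mhz, ref_volts):
    return (freq_mhz / ref_freq_mhz) * (volts / ref_volts) ** 2

# 1915 MHz @ 0.920 V versus the observed 2005 MHz @ 1.02 V boost state
ratio = relative_power(1915, 0.920, 2005, 1.02)
print(f"~{ratio:.0%} of the boost-state dynamic power")  # roughly 78%
```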


----------



## sblantipodi

Guys, what is the maximum VRAM usage in modern games?
Are there games that use up to 10GB or more?
Is this something we should worry about before buying a 3080?


----------



## mouacyk

sblantipodi said:


> guys what is the maximum VRAM usage in modern games?
> are there games that uses up to 10GB or more?
> is this something that we should worry about before buying a 3080?


Is 8GB-10GB VRAM enough for next gen? Can we settle this once and for all? (www.neogaf.com)

> So as we all know, the new NVIDIA RTX 30 series cards, namely the RTX 3070 and the RTX 3080, are shipping with 8GB and 10GB VRAM respectively. I've read all kinds of arguments for and against the notion that 8GB-10GB is enough for next gen, even at 2K and 4K resolutions at high frame rates. I am...

Id Software dev tweet regarding VRAM future-proofing:

https://twitter.com/i/web/status/1301126502801641473
Some qwik mafs:
max 4K diffuse textures per scene with 10GB:
(10,737,418,240 total bytes × 8 bits per byte) / (3840 × 2160 × 32 bits per pixel) = 323.6 textures
323.6 × 0.75 (reserving 25% for non-compressible, non-texture assets) × 4 (4:1 32-bit S3TC compression) = 970.8
970.8 / 4 material types = 242.7

Most games will have a combination of texture (including mip-map) sizes from 256 up to 4K, and various types of materials for each texture, but it's safe to say you can have up to 240 different textures in any given scene. We used to play games where we could count the total number of textures in a scene on both hands.
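The back-of-the-envelope above can be reproduced in a few lines, under the same assumptions as the post (uncompressed 32-bit 4K textures, 25% of VRAM reserved for non-texture assets, 4:1 S3TC compression, 4 material types):

```python
# Rough estimate of how many distinct 4K diffuse textures fit in 10 GB of VRAM.
vram_bits = 10 * 1024**3 * 8         # 10 GiB expressed in bits
bits_per_texture = 3840 * 2160 * 32  # one uncompressed 32-bpp 4K texture

raw_textures = vram_bits / bits_per_texture  # ~323.6 textures
usable = raw_textures * 0.75                 # reserve 25% for non-texture assets
compressed = usable * 4                      # 4:1 S3TC block compression
per_scene = compressed / 4                   # 4 material types per texture

print(round(per_scene, 1))  # 242.7
```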


----------



## sblantipodi

Still so undecided. Damn NVIDIA, those 10GB are a marketing choice to keep the cards good for only a very limited time.


----------



## AlKappaccino

sblantipodi said:


> Still so undecided. Damn NVIDIA, those 10GB are a marketing choice to keep the cards good for only a very limited time.


I feel like this: if you don't care about max OC scores, go for the 3080. If the 10GB of VRAM starts to limit you, there will be 20GB cards by then, assuming all the rumors are correct, and you can sell your 10GB card and get one of those. You'd still pay less overall than for the 3090 now, which doesn't look too compelling besides the high VRAM.


----------



## man from atlantis

Can any 3080 owner try the NVIDIA Asteroids mesh shader demo at 4K? It's not publicly available unless you're a developer, but you can download it here:

easyupload.io

----------



## Mooncheese

Dude, the eBay listings are hilarious:

rtx 3080: Search Result | eBay (www.ebay.com)

Frustrated gamers are battling eBay resellers with fake $50,000 bids and RTX 3080 'paper editions' (www.pcgamer.com): a bit of well-deserved trolling.


----------



## cstkl1

MVDDC on the TUF is the fans.


----------



## sakete

AlKappaccino said:


> I feel like this: If you don't care about max OC scores, go for the 3080. If the 10GB vram would start to limit you, there will be 20GB cards by the time, assuming all the rumors are correct. And you can sell your 10GB card then and get one of those. You still pay less overall than for the 3090 now which doesn't look to compelling besides the high vram.


I think that by the time 10GB starts to limit you, we'll have the 4080 or whatever. And if you don't game on 4K, you definitely don't need to worry about 10GB VRAM.


----------



## pewpewlazer

sblantipodi said:


> Still so undecided. Damn Nvidia, those 10GB are a marketing choice to make the cards beeing good for a very limited time


ALL graphics cards are "good for a very limited time", about 2 years on average. Then the next gen comes out and they're old news.

By the time 10GB of VRAM becomes a limitation, a 3080 will have nowhere near enough performance to run games at settings high enough for 10GB of VRAM to be required.


----------



## bkrownd

I expect to get at least 4 years out of mine


----------



## HyperMatrix

For those who said this was a paper launch with no actual consumer interest: Zotac had 20,000 preorders for their card through Amazon alone. That's a massive amount for one of the worst-built RTX 3080 cards released, through just one site.


ZOTAC received 20,000 orders for GeForce RTX 3080 Trinity through Amazon alone (videocardz.com)

> ZOTAC: Expect weeks of waiting for RTX 3080. The launch of the GeForce RTX 3080 will be remembered as the worst launch of a GeForce graphics card in recent history. The non-existent stock was depleted in a matter of minutes, sometimes seconds. It was confirmed that scalpers have used bots for...


----------



## Zemo

bkrownd said:


> I expect to get at least 4 years out of mine


I expect to purchase it after 4 years as well.


----------



## zhrooms

HyperMatrix said:


> For those who said this was a paper launch with no actual consumer interest, Zotac had 20,000 preorders for their card through Amazon alone. That's a massive amount for one of the worst built RTX 3080 cards released, through just 1 site.


No, anyone who believes it's poorly built really doesn't have the slightest clue of what's going on; this is standard practice in the industry and has been for a long time.

The Zotac RTX 3080 Trinity PCB is based on the NVIDIA reference 3090 20-power-stage 2x8-pin PCB, slightly modified: no NVLink or memory modules on the back, and the PCB is extended slightly (in length) for their fan/RGB connectors.

This leaves Zotac with the possibility of using 20 power stages for their 3090, just like the NVIDIA reference cards. But we're talking about the 3080 here, which has 17% fewer CUDA cores; it does not need the full 20 power stages. And since this is a cheap entry-level model with nothing fancy such as a high factory overclock, they gave it a 320W TDP like the FE, and for that power consumption 20 power stages would be completely unnecessary. So, to save money on production, they removed 4 power stages: 1 from the memory, because there are just 10 modules instead of 24 on the 3090, and 3 from the GPU. This leaves the card with 16 power stages (13+3), which is identical to the 2080 Ti 260/320W FE (also 8704 CUDA cores). And since Zotac does not provide a factory overclock (because it's an entry-level card, as mentioned), they didn't bother giving you more than a 336W max power limit, up from the stock 320W.

The reason the Founders Edition uses 18 power stages (2 more on the GPU) is that they also wanted to give the card a decent amount of overclocking headroom, so it features a 370W maximum power limit, up from 336W on the Zotac. You need to understand that the Founders Edition is not entry-level anymore; since Turing specifically it has featured its own custom PCB, this time with a 12-pin power connector because of their unique cooler design.

So no, you cannot compare the entry-level Zotac Trinity, which uses a (slightly modified) NVIDIA reference 2x8-pin PCB, to the NVIDIA Founders Edition's custom 1x12-pin PCB.

The Zotac Trinity VRM (13+3) is already more than enough; it has no issues pulling 500W+. You just have to find a custom BIOS that works on it (or shunt mod it).


----------



## HyperMatrix

zhrooms said:


> No, anyone who believes it's poorly built really doesn't have the slightest clue of what's going on; this is standard practice in the industry and has been for a long time.
> 
> Zotac RTX 3080 Trinity PCB is based on NVIDIA Reference 3090 20 Power Stage 2x8-Pin PCB, slightly modified, no NVLink or memory modules on the back, as well as extended PCB slightly (length) for their Fan/RGB connectors.
> 
> This leaves Zotac with the possibility of using 20 power stages for their 3090, just like the NVIDIA Reference cards, but we're talking about 3080 here, which has 17% less CUDA cores, it does not need the full 20 power stages, and since this is a cheap entry level model, with nothing fancy such as high factory overclocking, they gave it 320W TDP like the FE and for that power consumption, 20 power stages would be completely unnecessary, so to save money on production they removed 4 power stages, 1 from the memory because it's just 10 modules instead of 24 on the 3090, and 3 from the GPU. This leaves the card with 16 power stages (13+3) which is identical to 2080 Ti 260/320W FE (also 8704 CUDA Cores), and since Zotac does not provide a factory overclock (because it's entry level card as mentioned), they didn't bother giving you more than 336W max power limit up from the stock 320W.
> 
> The reason Founders Edition is using 18 power stages (2 more on the GPU) is because they also wanted to give the card a decent amount of overclocking headroom, so it features a 370W maximum power limit up from 336W on Zotac. You need to understand that Founders Edition is not entry level anymore, since Turing specifically, this time it even features its own custom PCB with 12-pin power connector because of their unique cooler design.
> 
> So no, you cannot compare entry level Zotac Trinity that uses (slightly modified) NVIDIA Reference 2x8-Pin PCB to NVIDIA Founders Edition Custom PCB 1x12-Pin.
> 
> Zotac Trinity VRM (13+3) is more than enough already, no issues pulling 500W+, you just gotta find a custom BIOS that works on it (or shunt mod it).


I said it’s one of the worst built RTX 3080 cards. Want to prove me wrong? Show me a list of RTX 3080 cards with worse quality/potential. All you’re doing is trying to argue that it’s adequate for its intended purpose, which isn’t related to my claim.


----------



## nick name

HyperMatrix said:


> I said it’s one of the worst built RTX 3080 cards. Want to prove me wrong? Show me a list of RTX 3080 cards with worse quality/potential. All you’re doing is trying to argue that it’s adequate for its intended purpose, which isn’t related to my claim.


I don't think you understand how credibility works. If you're the one making the claim "one of the worst built" then the burden is on you to prove your claim. You keep expecting people to simply believe you and then when confronted you challenge them to prove you wrong. It's on you to make us believe you. Prove yourself right.


----------



## zhrooms

HyperMatrix said:


> I said it’s one of the worst built RTX 3080 cards. Want to prove me wrong? Show me a list of RTX 3080 cards with worse quality/potential.


Sure 👌, go to the first page of this thread 😆

XC3, Phoenix, SG, Twin X2, iChill X3, iChill X4, Ventus, Gaming X Trio, Founders Edition, GamingPro, XLR8, and Trinity are all using the same base design (NVIDIA reference 20-power-stage 2x8-pin).
(Unknown: EX)
They are all equally good: all use doubled PWM controllers for true phases, no paired/parallel stages. It's up to the partners how many power stages they actually put on there, varying from 16 (13+3) to 20 (16+4). The Gigabyte EagleOC and GamingOC are 19 stages max, which is what you're thinking of; that's an actually worse PCB. They used 17 stages on the 3080 and 19 stages on the 3090 (EagleOC).

EagleOC 3080 (14+3) vs EagleOC 3090 (15+4): only one more power stage for the GPU on the 3090. Now that's an awful card. The Trinity 3080 is 13+3 and the Trinity 3090 is 16+4, exactly as expected.

Power stages also have nothing to do with overclocking: fewer stages just means a higher VRM temperature, and more stages a lower VRM temperature, not better overclocking. Nothing is stopping Zotac from using 10+3 power stages and giving it a stock power limit of 320W and a maximum of 375W. It would outperform most cards; the VRM would just run 10°C hotter, no biggie. Just like the 13+3 on the 2080 Ti had no issues running a 600W Time Spy Extreme; the VRM just got really hot, up to 120°C, with inadequate cooling. On water, temps typically did not exceed 60°C on any part of the card except the shunts.
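The stage-count-vs-temperature point can be illustrated with some rough numbers. All hypothetical: an assumed 250W core rail at roughly 1.0V; real loads, rail voltages, and per-stage ratings vary.

```python
# Fewer power stages => more current (and I^2*R heat) per stage, at the same total power.
def amps_per_stage(rail_watts, rail_volts, stages):
    total_amps = rail_watts / rail_volts  # total current into the rail
    return total_amps / stages            # shared evenly across the stages

for stages in (10, 13, 16):
    print(f"{stages} stages: {amps_per_stage(250, 1.0, stages):.1f} A each")
```

Since resistive losses go with the square of the per-stage current, spreading the same load over more stages lowers VRM temperature without changing how far the GPU itself can clock.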


----------



## gerardfraser

@*zhrooms*
I agree with you 100%. I believe all the RTX 3080 cards are just about equal, all providing around the same performance.


----------



## zhrooms

gerardfraser said:


> @*zhrooms*
> I agree with you 100%. I believe all the RTX 3080 cards are all just about equal and all providing around the same performance.


The only thing that mattered on 2080 Ti was the *power limit, power limit, power limit!*

I bought a Palit GamingPro (non-OC) model for $1025 pre-tax with a 250-280W power limit, which I overclock to 2205MHz at 1.125V using the GALAX XOC BIOS. Non-A chips were also just marketing bullshit from NVIDIA; plenty of people shunt modded them and reached the same 2175MHz at 1.093V. There was no binning on any card except maybe the Kingpin; they claimed there was, but we have no actual proof of it, we have to take their word for it, which I do since the card was limited and had an MSRP of around $1999. (GALAX OC Labs does not count, as that was a super-limited-edition card only sold directly by GALAX in Asia.)

So, the Palit GamingPro 280W, with a 100% NVIDIA reference PCB (identical in every way to the Founders Edition 320W), had zero issues reaching a max clock of 2175MHz at the NVIDIA 1.093V hard limit (pulling over 500W in Time Spy, 550W in Extreme). Just like every AORUS Xtreme, FTW3 Ultra, Hall of Fame, Strix OC and so on, also with the XOC BIOS: they had vastly improved VRM designs and PWM controllers, resulting in up to 30°C lower VRM temps, yet they were stopped by the voltage limit just like my Palit, at 1.093V (1.125V on the GALAX/Kingpin XOC).

All custom-PCB cards are worthless for air and water cooling; they won't give you a higher overclock. It's the ultimate marketing ploy by partners: it advertises them as some godly GPUs when they're no better than the cheapest cards on the market. The Strix OC had a 5W higher power limit than the FE, for $150 more. Meanwhile the Gigabyte Windforce OC and GamingOC, both running on the FE PCB, came with a 366W power limit, 41W higher than the Strix OC, and cost $100 less; the GamingOC cooler was a massive triple-fan 2.5-slot just like the Strix OC, making it a vastly better card. That means Strix owners basically paid $100 more to get a useless VRM (30°C lower doesn't matter) and a 41W lower power limit (this absolutely matters).

*The Strix OC PCB was made for LN2 overclocking*; that's what people don't seem to be aware of. *It has actual soldering points to enable voltage control* and more, crazy **** literally. That's why you (*SHOULD*) buy the Strix OC: to use voltage control with the Strix XOC BIOS. But to use the card under air or water is a *complete waste of money.* I guess you're allowed to buy it for looks, but you could literally buy around 40 other cards that feature a higher power limit for less money. *The Strix OC was one of the absolute worst cards on Turing for air and water overclocking.*

This is why I absolutely *love* their new *TUF* brand on Ampere, because it's *NOT* an LN2 card; it's basically a Strix with the LN2 capability removed, *for a much lower price.*

Anyone who is interested in buying the Strix OC for Ampere should most likely get the TUF OC instead. It still offers Dual BIOS and an extra HDMI 2.1 port, plus a full 20 power stage custom PCB with improved power delivery and PWM controllers; it has the same or similar VRM to what 3090 cards get, but on a 3080, which is very impressive.

They also completely maxed out the power limit on 2x8-Pin connectors: it ships at 340W stock and lets you increase it to the full 375W, which is the max spec for 2x8-Pin. The TUF OC is basically a completely maxed-out "RTX 3080 2x8-Pin" in every way. The only cards that will beat it, that we know of, are the FTW3 and Strix; we don't yet know the AORUS Xtreme's power limit, but it is very likely also higher, the Hall of Fame is yet to be announced, a Kingpin is not available for the RTX 3080, and likely no MSI Lightning either. This means the TUF will remain the best 2x8-Pin card and still the 4th best including 3x8-Pin cards (AORUS Xtreme ---W, FTW3 420W and Strix OC 400W), for a considerably lower price.
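The 375W figure above falls straight out of the PCIe power-delivery ratings: 75W from the x16 slot plus 150W per 8-pin connector. A quick sanity check:

```python
PCIE_SLOT_W = 75    # max power the x16 slot is rated to deliver
EIGHT_PIN_W = 150   # rating per 8-pin PCIe auxiliary connector

def max_board_power(n_eight_pin):
    """Spec-rated board power for a card with n 8-pin connectors."""
    return PCIE_SLOT_W + n_eight_pin * EIGHT_PIN_W

print(max_board_power(2))  # 375 -> ceiling for 2x8-Pin cards like the TUF OC
print(max_board_power(3))  # 525 -> headroom for 3x8-Pin cards (FTW3 420W, Strix OC 400W)
```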

If we get any kind of XOC BIOS with 600W or above for the 3080, and it's compatible with cheap NVIDIA Reference 2x8-Pin PCB cards such as the Zotac Trinity, it will basically make all the premium cards worthless if you cool them on water. The cheapest one you can find will be the best, overclocking identically to all the premium-priced cards, which is kind of insane. You'd think NVIDIA and the partners would be very against this type of freedom; anyone aware of this has no reason to give them more money for premium cards, because with an XOC BIOS they aren't premium anymore, they become terrible value cards instead.


----------



## asdkj1740

zhrooms said:


> Sure 👌, go the first page of this thread 😆
> 
> XC3, Phoenix, SG, Twin X2, iChill X3, iChill X4, Ventus, Gaming X Trio, Founders Edition, GamingPro, XLR8, Trinity are all using the same base design (NVIDIA Reference 20 Power Stage 2x8-Pin).
> (Unknown: EX)
> They are all equally good (all use double PWM controllers for true phases, no paired/parallel stages; it's up to partners how many power stages they actually put on there, varying from 16 (13+3) to 20 (16+4)). EagleOC and GamingOC are both 19 max, which is what you're thinking of; that's an actually worse PCB, they used 17 stages on the 3080 and 19 stages on the 3090 (EagleOC).
> 
> EagleOC 3080 (14+3) vs EagleOC 3090 (15+4), only one more power stage for GPU on 3090, now that's an awful card. Trinity 3080 is 13+3 and Trinity 3090 is 16+4, exactly as expected.
> 
> Power Stages also have nothing to do with overclocking; fewer stages just means higher VRM temps, more stages means lower VRM temps, not better overclocking. Nothing is stopping Zotac from using 10+3 power stages and giving it a stock power limit of 320W and a maximum of 375W. It would outperform most cards, the VRM would just run 10°C hotter, no biggie. Just like 13+3 on the 2080 Ti had no issues running 600W Time Spy Extreme; the VRM just got really hot, up to 120°C with inadequate cooling. On water, temps typically did not exceed 60°C on any part of the card except the shunts.


the official product page of the rtx 3080 eagle oc stated it was 13+4 compared to the reference 13+3, but that statement has recently been removed.

back in the day there were so many ppl flashing the xoc bios (no power limit) on gtx 1080 founders edition cards which have only 5 phases with subpar mosfets. i highly doubt 13 phases on the gpu can't handle ~350w.
on z490, good mobos with 12 phases of 50a/60a stages (without active cooling) can already provide ~330w.

i have no idea why ppl would currently blame the 3080 reference pcb's vrm... it is clearly limited by the bios...
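The phase-count argument above can be put into rough numbers: continuous VRM capacity is roughly stages x rated current x output voltage, times a thermal derating factor. The 0.7 derate and the voltages below are assumptions for illustration, not measured values:

```python
def vrm_capacity_watts(n_stages, amps_per_stage, v_out, derate=0.7):
    """Rough continuous VRM capacity; derate reflects thermal limits without active cooling."""
    return n_stages * amps_per_stage * derate * v_out

# The Z490 example above: 12 phases of 50A stages at ~0.9V CPU core voltage
print(round(vrm_capacity_watts(12, 50, 0.9)))   # 378 -> consistent with the ~330W claim
# A 13-stage GPU core VRM with 50A stages at ~1.0V
print(round(vrm_capacity_watts(13, 50, 1.0)))   # 455 -> ~350W is not a problem
```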


----------



## nievz

If you ran the MSI X Trio 3080 at, say, 80% fan, would it match the TUF's thermal performance? Is it just the fan curve holding it back? I think I read somewhere that the TUF uses high-RPM fans.


----------



## Mooncheese

There's a crypto company selling 3080s at a mark-up here. I have a few problems with this:

1. They are encouraging sales of these cards to go to crypto-miners.
2. They are selling at a mark-up, meaning, they are scalping.

Any way to report this to get them off of OC.net?






RTX 3080 - "Our two Mining branch warehouses are located in the Netherlands and Hong Kong; depending on the models ordered, your order will be shipped from one of these two logistics centres. For processing and delivery times, they are currently 1 working day after confirmation of payment (This time..." - www.overclock.net


----------



## nick name

Mooncheese said:


> There's a crypto company selling 3080s at a mark-up here. I have a few problems with this:
> 
> 1. They are encouraging sales of these cards to go to crypto-miners.
> 2. They are selling at a mark-up, meaning, they are scalping.
> 
> Any way to report this to get them off of OC.net?
> 
> RTX 3080 - "Our two Mining branch warehouses are located in the Netherlands and Hong Kong..." - www.overclock.net


Welp that disappeared quick.


----------



## Mooncheese

nick name said:


> Welp that disappeared quick.


Probably because I reported it.


----------



## Nizzen

Hello from Norway


----------



## Zemo

Bring on some benchmarks


----------



## asdkj1740

nievz said:


> If you ran the MSI X Trio 3080 say at 80% fan will it match the TUF's thermal performance? Is it just the fan curve holding it back? I think I read somewhere that the TUF uses high RPM fans.


definitely closing the gap; 2000rpm would help a lot. the stock fan curve is set to run at around 1000~1500rpm.

Optimum Tech and Hardware Unboxed have already shown that.


----------



## Nizzen

Zemo said:


> Bring on some benchmarks


Nzz1 is my user:


https://www.3dmark.com/hall-of-fame-2/timespy+3dmark+score+performance+preset/version+1.0/1+gpu


Pretty good on stock air cooling.

Timespy Extreme:


----------



## Nizzen

nr #9 on Firestrike Ultra

3DMark Fire Strike Ultra Hall of Fame - www.3dmark.com

#11 on Port Royal:

3DMark Port Royal Hall of Fame - www.3dmark.com

A few hours LOL


----------



## KingEngineRevUp

Nizzen said:


> nr #9 on Firestrike Ultra
> 
> 3DMark Fire Strike Ultra Hall of Fame - www.3dmark.com
> 
> #11 on Port Royal:
> 
> 3DMark Port Royal Hall of Fame - www.3dmark.com
> 
> A few hours LOL


You have an air conditioner blowing inside your system or something? 45C is super low.


----------



## Nizzen

KingEngineRevUp said:


> You have an air conditioner blowing inside your system or something? 45C is super low.


Garage 😎

Maybe 15c


----------



## ViTosS

Nizzen said:


> Hello from Norway


A bit off-topic, but where did you buy that 'stem' which holds the fan to your RAM? Where do you plug it in?


----------



## Nizzen

ViTosS said:


> A bit off-topic, but where did you buy that 'stem' which holds the fan to your RAM? Where do you plug it in?











ICEAGE J1 High Efficient Performance Flexible Metal Fan Mounting Kit ($14.99, free shipping worldwide) - www.moddiy.com





Mount it in one of the MB holes with a normal MB screw,

Or mount it elsewhere


----------



## Imprezzion

I built a bracket myself that mounts to the radiator screws of the top rad and holds a fan above the memory that way. Works great!

It took over 10°C off my memory temps, with them now running in the low 40°C range under heavy usage at 1.55V.

But, on topic, that cooler seems to perform exceptionally well lol. Palit didn't always have the best coolers, but this is amazing even for 15°C ambients; that would still be in the 50°C range at 20-23°C ambient. Amazing lol. How much noise does it produce? I might get one of these if it's quiet enough as well!


----------



## HyperMatrix

zhrooms said:


> Sure 👌, go the first page of this thread 😆
> 
> XC3, Phoenix, SG, Twin X2, iChill X3, iChill X4, Ventus, Gaming X Trio, Founders Edition, GamingPro, XLR8, Trinity are all using the same base design (NVIDIA Reference 20 Power Stage 2x8-Pin).
> (Unknown: EX)
> They are all equally good (all use double PWM controllers for true phases, no paired/parallel stages; it's up to partners how many power stages they actually put on there, varying from 16 (13+3) to 20 (16+4)). EagleOC and GamingOC are both 19 max, which is what you're thinking of; that's an actually worse PCB, they used 17 stages on the 3080 and 19 stages on the 3090 (EagleOC).
> 
> EagleOC 3080 (14+3) vs EagleOC 3090 (15+4), only one more power stage for GPU on 3090, now that's an awful card. Trinity 3080 is 13+3 and Trinity 3090 is 16+4, exactly as expected.
> 
> Power Stages also have nothing to do with overclocking; fewer stages just means higher VRM temps, more stages means lower VRM temps, not better overclocking. Nothing is stopping Zotac from using 10+3 power stages and giving it a stock power limit of 320W and a maximum of 375W. It would outperform most cards, the VRM would just run 10°C hotter, no biggie. Just like 13+3 on the 2080 Ti had no issues running 600W Time Spy Extreme; the VRM just got really hot, up to 120°C with inadequate cooling. On water, temps typically did not exceed 60°C on any part of the card except the shunts.


Yeah, you're totally right. The Zotac card is amazing. That's why TweakTown's review just said this, literally the worst 3080 reviewed so far anywhere. But yup. You're right. All those other manufacturers are stupid for not doing what Zotac is doing. Doing a great service to other members by sending them off to buy this turd of a card.

> After an hour in heavy GPU load situations, the ZOTAC GeForce RTX 3080 Trinity would hit a ceiling of around 1870MHz or so -- with my sample sitting comfortably at 1830-1840MHz. This is 150MHz+ under the GeForce RTX 3080 Founders Edition and MSI RTX 3080 GAMING X TRIO.

Read more: ZOTAC GeForce RTX 3080 Trinity Review


----------



## VPII

Okay, I got my Palit RTX 3080 GamingPro OC today. Played around with it to see stock performance in Time Spy, with no power limit adjustments, and I got this.



https://www.3dmark.com/spy/14046561



Adjusting the power limit, and setting performance mode in the Nvidia control panel, I got this



https://www.3dmark.com/spy/14046743



The adjusted power limit with +30MHz core gave me this



https://www.3dmark.com/spy/14047109



Increasing the core by +60MHz, for 2130MHz core, gave me



https://www.3dmark.com/spy/14047283



I could not go higher on the core, so I increased the memory and finally settled on +750MHz, which gave me the following with core at +60MHz. Oh, and CPU speed increased to 4.5GHz.

Time Spy


https://www.3dmark.com/spy/14048039



Time Spy Extreme


https://www.3dmark.com/spy/14048247



Port Royal


https://www.3dmark.com/pr/320633



I'll be honest, it is maybe not the best, but I am sure that if we get a BIOS to increase the power limit I'll get much better scores. I mean, look at the 3DMark stated average clock for the GPU.


----------



## acoustic

I must say, I'm not sure if I'm disappointed with the 3080 or impressed with my 2080 Ti; I'm only about ~1250 off your Port Royal score.

Don't know if I should wait for the AMD cards or just try to snag a 3090 in two days. Decisions...


----------



## Alemancio

You guys should really just play games and be happy.


----------



## zhrooms

Alemancio said:


> You guys should really just play games and be happy.


😆
*MSI Peter defending the 350W power limit on 3x8-Pin:*
_No no no, we want the people to be happy and play games._


----------



## shallow_

Nizzen said:


> Hello from Norway


Happy to see a fellow Norwegian who actually bought the card to use it.

We should make a scalpers' 'wall of shame' to list the names of all 3080 scalpers.

I have started taking note of all the sellers using their full names in listings here in Norway.

I knew it was going to be challenging when only going for the Gaming X Trio or Asus Strix (especially with the Strix cards not even out yet).

I'm keeping my MSI order open for now, to see when it ships, but it's the Asus I really want.

Congrats on your purchase man


----------



## hisXLNC

so which custom cards are good for watercooling?


----------



## ThrashZone

Hi,
12-pin power connector: so much fuss over nothing


----------



## cstkl1

it's insane that it can even do this at max 4K settings while recording via OBS in 4:4:4 BT.709, indistinguishable quality... 74GB for 25 min.


----------



## gerardfraser

Well, my Zotac RTX 3080 was a no-show, the MSI Trio RTX 3080 a no-show, but I got one of these today: an Asus TUF RTX 3080. And disappointment right off the bat.
My LG B9 OLED 65" will not run at any Hz without blanking out to a black screen, not even 1920x1080 120Hz at 10/12-bit. My Zotac RTX 2080 has no problem running with the same HDMI cable and setup.
Come on, LG and Nvidia.
The cable is fine:





Zeskit 8K Ultra HD High Speed 48Gbps HDMI Cable 6.5ft (8K60 4K120 144Hz eARC HDR10 4:4:4 HDCP 2.2 & 2.3, compatible with Dolby Vision, Xbox, PS4, PS5, Apple TV 4K, Roku, Fire TV, Switch, Vizio, Sony, LG, Samsung) - www.amazon.com


----------



## Alemancio

I'm convinced it's impossible to buy an RTX 3080 these days. I refreshed until finding "add to cart" way before the trackers noticed stock (the push notification hadn't been sent) and STILL wasn't able to buy the card...


----------



## Chargeit

Alemancio said:


> I'm convinced it's impossible to buy an RTX 3080 these days. I refreshed until finding "add to cart" way before the trackers noticed stock (the push notification hadn't been sent) and STILL wasn't able to buy the card...


I haven't seen one in stock in days.


----------



## gerardfraser

Alemancio said:


> I'm convinced it's impossible to buy an RTX 3080 these days. I refreshed until finding "add to cart" way before the trackers noticed stock (the push notification hadn't been sent) and STILL wasn't able to buy the card...


I am not trying to be that guy.
The card I got today was purchased Sept 22 2020, 7:27 PM, at Canada Computers. Canadian dollars.


Spoiler


----------



## Imprezzion

I'm still waiting for open-box returns to become available locally in webshops. Only problem is, no one is going to return a card they can sell secondhand at a big mark-up...

EDIT: I just saw EVGA's tweet that thousands of cards would be shipped to etailers this week and supply should be a lot better. Finally I can order an FTW3 Ultra when they arrive haha.


----------



## Flisker

Imprezzion said:


> EDIT: I just saw EVGA's tweets that thousands of cards would be shipped to etailers this week and supply should be a lot better. Finally I can order a FTW3 Ultra when they arrive haha.


I can't see that tweet, would you be able to link it to me?


----------



## Chargeit

Up on Best Buy,



https://www.bestbuy.com/site/pny-geforce-rtx-3080-10gb-xlr8-gaming-epic-x-rgb-triple-fan-graphics-card/6432655.p?skuId=6432655



And, it's gone. Waiting for an MSI/Asus/Gigabyte/EVGA myself.


----------



## shallow_

First video I've found of the Asus Strix 3080


----------



## doom26464

gerardfraser said:


> I am not trying to be that guy.
> The card I got today was purchased Sept 22 2020, 7:27 PM, at Canada Computers. Canadian dollars.
> 
> 
> Spoiler


Their website lists it as not available online.

So it must be in-store purchases only.


----------



## gerardfraser

Wonder when new stock comes in. I am sure I posted a couple of places where cards were available to buy on pre-order that will be in stock in a short time.


----------



## doom26464

gerardfraser said:


> Wonder when new stock comes in,I am sure I posted a couple places where cards where available to buy on pre-order that will be in stock in a short time.


If you know of where please post up. 

So far in Canada the only place I was able to put a pre order down was with memory express. However memory express is always the slowest for getting new stock of stuff it seems.


----------



## man from atlantis

Strix or FTW3? Decisions, decisions


----------



## ThrashZone

man from atlantis said:


> Strix or FTW3?, decisions decisions


Hi,
Not really, if you've ever RMA'ed something: EVGA all the way lol


----------



## sakete

ThrashZone said:


> Hi,
> Not really if you've ever rma'ed something evga all the way lol


Yes, EVGA for sure. Just cross-shipped a PSU with them, easiest RMA I've ever done.


----------



## Imprezzion

Flisker said:


> I can't see that tweet, would you be able to link it to me?


I got it from Tom's.









EVGA to Stock 'Thousands' of Nvidia RTX 3080 Ampere GPUs Soon - "Here come the cards... at least some of them" - www.tomshardware.com


----------



## gerardfraser

doom26464 said:


> If you know of where please post up.
> 
> So far in Canada the only place I was able to put a pre order down was with memory express. However memory express is always the slowest for getting new stock of stuff it seems.


Yesterday I posted in the other 3080 thread with 18 cards for pre-order at two different spots, CDW and shopRBC. I made a list of about 50 Canadian shops; I have not updated it since May 13, 2020, so expect a couple of shops to have shut down with covid. The list is also on this site if you want to look.








Looking for computer parts in Canada: list of some stores - "For my fellow Canadians: search online stores in Canada for your computer needs at 50+ stores. My personal list I've been using for years." - forums.guru3d.com


----------



## Professor McNasty

ThrashZone said:


> Hi,
> Not really if you've ever rma'ed something evga all the way lol


I agree. EVGA has some of the best customer service in the business. Waiting on my 3080 XC3 Ultra Gaming.


----------



## Shadowdane

Does anyone know if MSI Afterburner can control all 3 fans on EVGA 30 series cards??

I know previously, with my EVGA 1080 Ti, Afterburner didn't support their fan controllers and would only control a single fan, not all 3. I really prefer Afterburner personally; not a fan of the Precision software.


----------



## Professor McNasty

Has anyone heard anything about when EKWB might be showcasing more waterblocks for the AIB cards? I've got an EVGA RTX 3080 XC3 Ultra Gaming on backorder and am hoping to have a watercooling solution for it ASAP. Best I can see is EKWB has said:



EKWB said:


> EK water blocks for EVGA XC3 and ASUS TUF Series graphics cards will follow in a matter of weeks, while the Founders Edition water blocks will take a bit longer since EK is preparing something really special for those graphics cards.


----------



## slopokdave

Hi everyone. I'm not new to overclocking, but I am new to possibly flashing another vbios. I've got an EVGA XC3 3080 on the way, should it be as simple as backing up my vbios and flashing a 3080 FTW3 vbios? FTW3 has 3 8-pins instead of 2, but it appears that 3080 XC3 is limited to 320W via vbios so this should give me at least some boost, right? Thanks in advance... I've got a full custom loop, so temperatures are not a concern.


----------



## LesPaulLover

slopokdave said:


> Hi everyone. I'm not new to overclocking, but I am new to possibly flashing another vbios. I've got an EVGA XC3 3080 on the way, should it be as simple as backing up my vbios and flashing a 3080 FTW3 vbios? FTW3 has 3 8-pins instead of 2, but it appears that 3080 XC3 is limited to 320W via vbios so this should give me at least some boost, right? Thanks in advance... I've got a full custom loop, so temperatures are not a concern.


Is this confirmed working? With a card like the 3080 in such short support of I'd


acoustic said:


> I must say, I'm not sure if I'm disappointed with the 3080, or impressed with my 2080TI; I'm only about ~1250 off your Port Royal score.
> 
> Don't know if I should wait until the AMD cards, or just try to snag a 3090 in two days. Decisions ..


The 3090 is only "10-15% fast @ 4K" according to a post today from Nvidia themselves.


----------



## ChaosBlades

Juan confirmed in one of the comments in the ASUS PC DIY Facebook group that YES, the Strix OC and Strix non-OC have the same board power limit. Same answer for the TUF OC vs non-OC model.

So the non-OC TUF card can be updated to the 340/375 power limit just like the OC model.


----------



## Rbk_3

Ended up selling mine. Was having way too many issues in Warzone; I'm going to wait for more mature drivers and get the actual card I want. Might pick up an EVGA 2070S and get in queue to step up to an FTW3 Ultra. Camped out for 12 hours for a whole lot of headache.

Sent from my iPhone using Tapatalk


----------



## slopokdave

LesPaulLover said:


> Is this confirmed working? With a card like the 3080 in such short support of Id


Haha didn't finish your thought? I have no idea though. I'm certainly not willing to be the first to try it!


----------



## ChaosBlades

Juan just confirmed on Facebook that the Strix / Strix OC has a power target at 100% of 370W. Trying to get him to confirm the max power target.

So TUF / TUF OC is 340/375 and Strix / Strix OC is 370/---


----------



## slopokdave

ChaosBlades said:


> Juan just confirmed on Facebook that the Strix / Strix OC has a power target at 100% of 370W. Trying to get him to confirm the max power target.
> 
> So TUF / TUF OC is 340/375 and Strix / Strix OC is 370/---


Strix was 370/420 I think? Maybe even 440? He said it on a live stream, I can't remember..


----------



## HyperMatrix

slopokdave said:


> Strix was 370/420 I think? Maybe even 440? He said it on a live stream, I can't remember..


FTW3 is 420W. I haven't heard of any other 3080 being 420W or 440W. Even the ROG Strix 3000 series page says it has "up to 400W." So if something's changed, it hasn't been updated on their website. It would also be great news. If you have a source, share it please. Thanks.


----------



## slopokdave

HyperMatrix said:


> FTW3 is 420W. I haven't heard of any other 3080 being 420W or 440W. Even the ROG Strix 3000 series page says it has "up to 400W." So if something's changed, it hasn't been updated on their website. It would also be great news. If you have a source, share it please. Thanks.
> 
> View attachment 2459864


Sorry yeah I may be confusing it with FTW3... They are starting to blur together after this week. 🤣


----------



## VPII

I got my Palit RTX 3080 GamingPro OC this week after placing the order on 17 September 2020. Yes, I got one, as I live in South Africa, so we don't have people that buy to sell at astronomical pricing. The card is great, make no mistake. It is just a pity that stock is 320W and increased you only get 350W. There is so much performance left on the table due to the power limit being way too low.

I ordered before actually checking reviews of it, and I have to say the results the reviews got are pretty low compared to what I am seeing, more so the Guru3d review. I mean, look at the Hexus review of the card compared to Guru3d. Seriously. Yes, Hexus did use a Ryzen 9 3950X, and their result ties in with mine in Time Spy, although mine is somewhat higher with my CPU at a 4.2GHz manual overclock. But Guru3d, with a 9900K, should have scored a lot higher. I did notice that at the beginning of the review the heading states Palit RTX 3080 GamingPro OC and later just GamingPro without the OC. Even so, I do feel their results are poor for the card.


----------



## nievz

Rbk_3 said:


> Ended up selling mine. Was having way too many issues in Warzone; I'm going to wait for more mature drivers and get the actual card I want. Might pick up an EVGA 2070S and get in queue to step up to an FTW3 Ultra. Camped out for 12 hours for a whole lot of headache.
> 
> Sent from my iPhone using Tapatalk


Why, what did you get? Are you experiencing the CTD issues? Is downclocking a workaround?


----------



## Vapochilled

I'm doing 1920 @ 0.91v and [email protected],96v -> this gives me a stable line hitting the 340W power limit.
It depends on the game and resolution. I play all games at 5Kx1440p, 32:9, ultra.
Some games hit the 340W with those settings; other games are less heavy and I can do [email protected] without hitting 340W...

So, when people talk about power limits and this and that about OC, keep in mind this varies game to game and depends on resolution.
TimeSpy Ultra makes me go down to [email protected], hitting 340W all the time...

Also, I disagree with raising the TDP from 340W to 420W just to go from 1930 to 1980 or so.
That's like 25% more power for 3% more performance, or less...
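Putting numbers on the tradeoff quoted above (340W to 420W for roughly 1930 to 1980MHz), assuming performance scales about linearly with core clock:

```python
def pct_increase(new, old):
    """Percentage increase from old to new."""
    return (new / old - 1) * 100

power_gain = pct_increase(420, 340)    # board power increase
clock_gain = pct_increase(1980, 1930)  # core clock increase

print(f"{power_gain:.1f}% more power for {clock_gain:.1f}% more clock")
# -> 23.5% more power for 2.6% more clock
```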


----------



## HyperMatrix

Vapochilled said:


> I'm doing 1920 @ 0.91v and [email protected],96v -> this gives me a stable line hitting 340W power limit
> Depends on the game and resolution. I play all games in 5Kx1440p, 32:9 ultra.
> Some games put the 340W with those settings, other games are less heavy and i can do [email protected] without hitting 340W...
> 
> So, when people talk about powerlimits and this and that about OC, keep in mind this goes game to game and depends on resolution.
> TimeSpy Ultra makes me go down to [email protected] and hitting 340W all the time...
> 
> Also, i disagree about raising the TDP from 340W to 420W, just to increase from 1930 to 1980 or so.
> Thats like 25% more power for 3% or less more...


FTW3 hits 2040-2070MHz on 410W.


----------



## asdkj1740

gigabyte 3090 eagle pcb

source: https://www.bilibili.com/video/BV1w54y117jx





btw, in the live stream of aorus last week the gigabyte guy (TW based) said the aorus 3080 is aiming at 450w, that the card is 3.5 slot, and that the reason there's no pcb support bracket is that the backplates on the gigabyte rtx 3000 series have been reinforced, strong enough to hold the pcb and the heatsink, so you won't see sag at all (super big claim).


evga's jacob explains why the evga 3080 ultra, with its higher factory overclock, performs worse than the fe.


----------



## Vapochilled

HyperMatrix said:


> FTW3 hits 2040-2070MHz on 410W.


It can HIT that... but it's unplayable.
Whenever you go above 2000MHz on the core, the games crash. I've experienced that in F1 2019 and COD.
There is also an article on videocardz about it.


----------



## HyperMatrix

Vapochilled said:


> It can HIT... but its unplayable.
> whenever you go above 2000Mhz on the core, the games crash. I've experienced that on F1 2019 and COD.
> There is an article also on videocardz about it


Are you sure it crashed with an FTW3? It was stable through repeated Time Spy / Port Royal stress tests and benchmarks.


----------



## Vapochilled

HyperMatrix said:


> You sure it crashed with FTW3? It was stable through repeated timespy/port royal stress tests/benchmarks.


Maybe not with FTW3. I have the cheap Eagle OC from Gigabyte  But i am happy with [email protected],96v / 340W
Maybe if a BIOS is released to get 370W, i could get [email protected]


----------



## HyperMatrix

Vapochilled said:


> Maybe not with FTW3. I have the cheap Eagle OC from Gigabyte  But i am happy with [email protected],96v / 340W
> Maybe if a BIOS is released to get 370W, i could get [email protected]


I read some of the data on it. It could be a software/driver issue, but it seems to be disproportionately affecting cards with lower TDPs and fewer power stages. And yeah, 1970 vs 2070 is like a 5% difference; not worth paying so much more or worrying about it that much, at least for regular people. I have OCD, so it's a different case entirely.


----------



## Vapochilled

HyperMatrix said:


> I read some of the data on it. Could be a software/driver issue, but it seems to be disproportionately affecting cards with lower TDP and fewer power stages. and yeah 1970 vs 2070 is like 5% difference. Not worth paying so much more or worrying about it that much. At least for regular people. I have OCD so it’s a different case entirely.


To me it makes even less sense to pay that 150€ more because I'm still using my old 6700K from 2015.
However, playing at 5K ultra settings, the difference to a 9900K is like 3%, if that much.
At 1080p it would be more like 80%.

I guess this 6700K will hold out 1 or 2 more years, and maybe even then I would just mod the Asus Z170 BIOS for the Coffee Lake pin mod. I saw people using a 9900K on Z170.


----------



## sjd

HyperMatrix said:


> I read some of the data on it. Could be a software/driver issue, but it seems to be disproportionately affecting cards with lower TDP and fewer power stages. and yeah 1970 vs 2070 is like 5% difference. Not worth paying so much more or worrying about it that much. At least for regular people. I have OCD so it’s a different case entirely.


Might have something to do with those cards being more common than others. Hopefully it's just a driver issue; mine runs fine so far as long as I keep it at stock. At least I don't have to downclock it like some others have had to. I have a 3080 Gaming X Trio and am going to slap a WB on it, so I decided to keep it instead of waiting for a TUF. Crossing my fingers for a VBIOS with a higher PL too.


----------



## gerardfraser

I run an RTX 3080 at +40 core and +400 memory; depending on the game, you might get 2-6 FPS more than stock.
Depending on the game I've seen GPU clocks running anywhere from dips to 1720MHz when hitting the 355W limit, up to 2070MHz, to not using the GPU at all.
Undervolting to 0.8v gives around the same results as stock, minus 1-2 FPS, while consuming less power.
Now, I will be flashing a card with another BIOS for more power, but there will be no gain in overclocked FPS.
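The undervolting result above matches the usual first-order model for dynamic power, P proportional to f*V^2: lowering voltage pays off quadratically while the small clock loss costs only linearly. A sketch with assumed operating points (the stock 1900MHz @ 1.06V figure is illustrative, not a measurement):

```python
def relative_power(f_new, v_new, f_old, v_old):
    """First-order CMOS dynamic power model: P ~ f * V^2."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# Assumed stock ~1900MHz @ 1.06V vs an undervolt to ~1850MHz @ 0.80V
ratio = relative_power(1850, 0.80, 1900, 1.06)
print(f"undervolted card draws ~{ratio:.0%} of stock power")
# -> ~55% of stock power
```

Real savings are smaller because static/leakage power and the memory don't follow this model, but it explains why a deep undervolt costs only a frame or two.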


----------



## VPII

gerardfraser said:


> I run an RTX 3080 at +40 core and +400 memory; depending on the game, you might get 2-6 FPS more than stock.
> Depending on the game I've seen GPU clocks running anywhere from dips to 1720MHz when hitting the 355W limit, up to 2070MHz, to not using the GPU at all.
> Undervolting to 0.8v gives around the same results as stock, minus 1-2 FPS, while consuming less power.
> Now, I will be flashing a card with another BIOS for more power, but there will be no gain in overclocked FPS.


Okay, please tell me how you are flashing an RTX 3080 card. From what I understand there is no software that can do this yet.


----------



## Vapochilled

How will you flash another BIOS? Do you have NVFlash already working? Do you have other BIOS files?


----------



## gerardfraser

I personally do not have a flash tool yet. When I do, I will flash the cards.


----------



## Professor McNasty

gerardfraser said:


> I personally do not have a flash yet.When I do I will flash the cards


I’m hoping for some unlocked vBios files soon for the EVGA XC3 cards.


----------



## sdmf74

Subbed


----------



## Imprezzion

Professor McNasty said:


> I’m hoping for some unlocked vBios files soon for the EVGA XC3 cards.


I'm hoping for just the card alone in the first place hahaa


----------



## Professor McNasty

Imprezzion said:


> I'm hoping for just the card alone in the first place hahaa


30+ days until they ship my XC Ultra Gaming. I’m right there with you, friend.


----------



## ghostofmurph

Is there any public way yet to see memory temps? Seems like stock memory temps on the FE are totally unsafe. What are the odds there will be a public tool to let us undervolt the memory?

Long into Vega's life there was a private tool to undervolt or overvolt the memory, but I'm not sure if it ever ended up being publicly released.


----------



## ttnuagmada

I chickened out on the 3090 and ended up backordering a strix OC and preordered one of the EK blocks. Have no idea when it will get here. Surely 10 gigs will last me a couple of years at 1440p.


----------



## Imprezzion

ghostofmurph said:


> Is there any public way yet to see memory temps? Seems like stock memory temps on the FE are totally unsafe. What are the odds there will be a public tool to let us undervolt the memory?
> 
> Long into Vega's life there was a private tool to undervolt or overvolt memory, but I'm not sure if it ever ended up being publicly released


Makes me wonder if the RAM even has internal temp monitoring, or just an external sensor, which tends to under-read anyway.


----------



## asdkj1740

Professor McNasty said:


> I’m hoping for some unlocked vBios files soon for the EVGA XC3 cards.


The cooler is really bad, if not the worst among 3080s.
Therefore there won't be an unlocked BIOS with a higher power limit for this card; the card just won't be able to handle it with stock cooling.


----------



## slopokdave

asdkj1740 said:


> The cooler is really bad, if not the worst among 3080s.
> Therefore there won't be an unlocked BIOS with a higher power limit for this card; the card just won't be able to handle it with stock cooling.


Those of us on water have entered the chat. 😁


----------



## Chamidorix

Some vBios files appearing on TPU......


----------



## wholeeo

What are you guys scoring with your out-of-the-box FEs? I feel like something is wrong with my system.



https://www.3dmark.com/spy/14110198



That's the highest graphics score I can get after DDU, messing with settings, fan curves, overclocking, etc. It seems like people are easily getting over 17000 with these cards, and out of the box I was scoring in the 15500 area.


----------



## Professor McNasty

asdkj1740 said:


> the cooler is really bad, if not the worst among 3080.
> therefore it wont be any unlocked bios for more power limit for this card, the card just wont be able to handle it with stock cooling.


I don’t think this is right. From all the reviews I’ve seen so far, the XC3 Ultra is right up there with the TUF and the other two-connector AIB cards.

Regardless, I plan on putting a waterblock on it anyway, and EVGA’s customer service is some of the best in the industry.


----------



## gerardfraser

wholeeo said:


> What are you guys scoring with your out of the box FE's? I feel like something is wrong with my system,
> 
> 
> 
> https://www.3dmark.com/spy/14110198
> 
> 
> 
> Highest I can get on the graphics score after DDU, messing with settings, fan curves, overclocking, etc. Seems like people are easily getting over 17000 with these cards and out of the box I was scoring in the 15500 area.


Is this the free version? I will download it and test it out on an RTX 3080.


----------



## asdkj1740

Professor McNasty said:


> I don’t think this is right. From all the reviews I’ve seen so far the XC3 Ultra is right up there with the TUF and other 2 cable AIBs.
> 
> Regardless, I plan on putting a waterblock on it anyway, and EVGA’s customer service is some of the best in the industry.


thats bs.

nice.


----------



## t1337dude

So...was anyone able to a slap a Morpheus 2 on their 3080?


----------



## wholeeo

gerardfraser said:


> Is this the free version? I will download it and test it out on an RTX 3080.


Thanks.


----------



## gerardfraser

wholeeo said:


> Thanks.


Sure, no problem. Sorry, I just read the post again; you asked for out of the box and I do not have an FE, lol. I'll be back to post a score at default.
RTX 3080 Asus Tuf

Default 
GPU score- 17793



https://www.3dmark.com/3dm/50824768?






Spoiler



Core-40
Memory-400
GPU score-18492



https://www.3dmark.com/3dm/50824309


----------



## Professor McNasty

asdkj1740 said:


> thats bs.
> 
> nice.


I’m sorry, what? Can you use a few more words to explain what you mean?


----------



## asdkj1740

Professor McNasty said:


> I’m sorry, what? Can you use a few more words to explain what you mean?


80c








EVGA RTX3080 XC3 Ultra simple unboxing! - Mobile01


(Translated: when upgrading my PC I kept my 1080, waiting for the 3080/3090. The shops I asked had no news before launch, I didn't win EVGA's launch raffle, and the card sold out immediately at release, but a friend got me one. YouTube already has plenty of performance tests, so this is just a simple picture unboxing of the EVGA RTX3080 XC3.)


www.mobile01.com





another one, 77c










There is no way the XC3 can be on par with the TUF, not even close.

By the way, may I have links to the reviews you looked at? My thread on r/hardware, pointing out that some reviews of the EVGA XC3 don't dare to show the temperature test, got banned immediately.


----------



## wholeeo

gerardfraser said:


> Sure, no problem. Sorry, I just read the post again; you asked for out of the box and I do not have an FE, lol. I'll be back to post a score at default.
> RTX 3080 Asus Tuf
> 
> Default
> GPU score- 17793
> 
> 
> 
> https://www.3dmark.com/3dm/50824768?
> 
> 
> 
> 
> 
> 
> Spoiler
> 
> 
> 
> Core-40
> Memory-400
> GPU score-18492
> 
> 
> 
> https://www.3dmark.com/3dm/50824309



I mean, jeez, look at this.



https://www.3dmark.com/compare/spy/14113024/spy/14113005#



Not sure what is bogging me down.


----------



## gerardfraser

wholeeo said:


> I mean, jeez, look at this.
> 
> 
> 
> https://www.3dmark.com/compare/spy/14113024/spy/14113005#
> 
> 
> 
> Not sure what is bogging me down.


Yeah, that is crazy; it looks like everything about your GPU is better than mine, and I was even uploading two videos and listening to a podcast on YouTube. Makes no sense to me looking at the numbers.


----------



## Professor McNasty

asdkj1740 said:


> 80c
> 
> 
> 
> 
> 
> 
> 
> 
> EVGA RTX3080 XC3 Ultra simple unboxing! - Mobile01
> 
> (Translated: when upgrading my PC I kept my 1080, waiting for the 3080/3090. The shops I asked had no news before launch, I didn't win EVGA's launch raffle, and the card sold out immediately at release, but a friend got me one. YouTube already has plenty of performance tests, so this is just a simple picture unboxing of the EVGA RTX3080 XC3.)
> 
> www.mobile01.com
> 
> 
> 
> 
> 
> another one, 77c
> View attachment 2459971
> 
> 
> 
> There is no way the XC3 can be on par with the TUF, not even close.
> 
> By the way, may I have links to the reviews you looked at? My thread on r/hardware, pointing out that some reviews of the EVGA XC3 don't dare to show the temperature test, got banned immediately.


These are random Chinese-run benchmarks... what fan RPM were they running? Did they modify the fan curve? Do we even know if the TUF and the XC3 have the same fan curve?

BPS was able to get +115 core and +600 memory using a modified fan curve and didn't get thermally throttled. He hit power limits before he hit heat.


----------



## gerardfraser

All the RTX 3080s are pretty much the same: crap overclocks, but they run cool and quiet as far as I can tell.


----------



## Professor McNasty

gerardfraser said:


> All the RTX 3080s are pretty much the same: crap overclocks, but they run cool and quiet as far as I can tell.


Yep. This is exactly what I’ve been hearing too. I wish there were more AIB reviews but I’m sure as stock becomes available we will see more videos pop up.


----------



## gerardfraser

Well, I can say I have used 3 different 3080s and they're all the same to me.


----------



## asdkj1740

Professor McNasty said:


> These are random Chinese run benchmarks...what fan RPM were they running? Did they modify the fan curve? Do we even know if the TUF and the XC3 have the same fan curve?
> 
> BPS was able to get +115 core and +600 memory using a modified fan curve and didn’t get heat throttled. He hit power limits before he hit heat:


I heard that from JayzTwoCents too; he said only temps above the thermal target, around 83°C, should be considered thermal throttling.


That would be even worse if those Chinese reviewers were using a custom fan curve to boost the fan RPM.


Another user, 75°C:

https://www.reddit.com/r/nvidia/comments/ivqjyt


----------



## Professor McNasty

asdkj1740 said:


> I heard that from JayzTwoCents too; he said only temps above the thermal target, around 83°C, should be considered thermal throttling.
> 
> That would be even worse if those Chinese reviewers were using a custom fan curve to boost the fan RPM.
> 
> Another user, 75°C:
> 
> https://www.reddit.com/r/nvidia/comments/ivqjyt


In the same thread, someone said that Newegg had also had 4 EVGA cards fail thermal tests, but they could provide no proof; they then said it was fixed by the firmware update EVGA put out for the fan/coil whine.

Do we have any credible, reputable sources that can back up this claim of the EVGA running this hot, especially after the update, and with fan RPM numbers?


----------



## asdkj1740

Professor McNasty said:


> In the same thread, someone said that Newegg had also had 4 EVGA cards fail thermal tests, but they could provide no proof; they then said it was fixed by the firmware update EVGA put out for the fan/coil whine.
> 
> Do we have any credible, reputable sources that can back up this claim of the EVGA running this hot, especially after the update, and with fan RPM numbers?


We will see. But one thing is for certain: the TUF is still way better than the EVGA XC3.

When big reviewers are shy enough not to show you the temperature, fan speed, and power consumption, something is wrong.


----------



## Professor McNasty

asdkj1740 said:


> We will see. But one thing is for certain: the TUF is still way better than the EVGA XC3.
> 
> When big reviewers are shy enough not to show you the temperature, fan speed, and power consumption, something is wrong.


----------



## Shadowdane

wholeeo said:


> I mean, jeez, look at this.
> 
> 
> 
> https://www.3dmark.com/compare/spy/14113024/spy/14113005#
> 
> 
> 
> Not sure what is bogging me down.


It shows right in the score's graphics details: his average clock speeds are about 90 MHz higher.

Clock frequency: 2,040 MHz vs 1,995 MHz
Average clock frequency: 1,949 MHz vs 1,859 MHz

Also, the CPU is faster, which would help to some extent as well.


----------



## VPII

I find it somewhat interesting. I called out Guru3d in one or two threads regarding their review of the Palit RTX 3080 GamingPro OC, and wanted to comment on it now, as their results were pretty poor. I did notice that the review's title says GamingPro OC but further into the review it says only GamingPro, which might actually be the card they tested. However, even if it is only the GamingPro, the 3DMark Time Spy result is pretty poor. This morning, when I searched again for the Palit RTX 3080 GamingPro OC review and opened Guru3d's review, I was totally shocked at what they give as results in 3DMark Time Spy and 3DMark Fire Strike Ultra.

Now the 3dmark Time Spy overall result looks somewhat in line with what I have seen with mine, but their graphics scores are way out of line. Go and look in the 3dmark database and search results for 3dmark Time Spy with the RTX 3080 and check the graphics scores showed right next to overall score. I'll post one below.



https://www.3dmark.com/spy/14086205



Now think about it: the average temp shown is 54°C, so it is not under LN2, yet the average clock is 2034 MHz from a maximum clock of 2055 MHz. This would mean the only drop in clocks this run had was a result of temps, not power. If power were limiting the card at, say, the Palit RTX 3080 GamingPro OC's maxed-out limit of 350W, then the average clock would have been just below 1900 MHz.



https://www.3dmark.com/spy/14114514



Look, my stock result is usually around 17600 or higher, but I figured I'd post this one just to show average clocks.


----------



## VPII

VPII said:


> I find it somewhat interesting. I called out Guru3d in one or two threads regarding their review of the Palit RTX 3080 GamingPro OC and wanted to comment about it now as their results were pretty poor. I did however notice that when entering the review it states the GamingPro OC but when you go further into the review it states only GamingPro which might be the case then. However, even if it is only the GamingPro the 3dmark Time Spy result is pretty poor. This morning however when I searched again for the Palit RTX 3080 GamingPro OC review and entered the Guru3d's review I was totally shocked at what they give as results in 3dmark Time Spy and 3dmark Fire Strike Ultra.
> 
> Now the 3dmark Time Spy overall result looks somewhat in line with what I have seen with mine, but their graphics scores are way out of line. Go and look in the 3dmark database and search results for 3dmark Time Spy with the RTX 3080 and check the graphics scores showed right next to overall score. I'll post one below.
> 
> 
> 
> https://www.3dmark.com/spy/14086205
> 
> 
> 
> Now think about it, the average temp shown is 54c so it is not under LN2 however the average clocks is 2034mhz from a maximum clock of 2055mhz. This would mean that the only drop in clocks this run had was as a result of the temps, not power. If power was limited for the card at say the limit on the Palit RTX 3080 GamingPro OC which is 350watt if maxxed out, then that average clocks would have been just below 1900mhz.
> 
> 
> 
> https://www.3dmark.com/spy/14114514
> 
> 
> 
> Look my stock result is usually around 17600 or higher but I figured I'll put this one just to show average clocks.


Okay, let me just add: the run I gave from my system was with the fan set to 75%, but this run is with the fan curve at stock.



https://www.3dmark.com/spy/14114982



This result is what I have managed with a +90 MHz core overclock and a +750 MHz memory overclock. Now look at the average clock stated by 3DMark: 1955 MHz from a 2190 MHz max clock. The power limit on this card will pull your clocks back a lot.



https://www.3dmark.com/spy/14087792
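The gap between average and maximum clock in a run is a quick gauge of how hard the power limit is pulling the card back. A small sketch of that comparison, using the clock figures quoted above as sample inputs:

```python
# Compare average to peak clock over a benchmark run: a bigger gap means
# more time spent throttled below the requested clock. The sample values
# mirror the two runs discussed above (2034 avg / 2055 max is barely
# throttled; 1955 avg / 2190 max is heavily power-limited).

def throttle_gap(avg_mhz: float, max_mhz: float) -> float:
    """Fraction of the peak clock lost on average during the run."""
    return 1.0 - avg_mhz / max_mhz

for label, avg, peak in [("temp-limited run", 2034, 2055),
                         ("power-limited run", 1955, 2190)]:
    print(f"{label}: {throttle_gap(avg, peak):.1%} below peak")
```

Roughly 1% below peak versus over 10% below peak, which is why the second run's big overclock doesn't translate into a proportionally higher score.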


----------



## gerardfraser

Shadowdane said:


> It shows right in the score's graphics details: his average clock speeds are about 90 MHz higher.
> 
> Clock frequency: 2,040 MHz vs 1,995 MHz
> Average clock frequency: 1,949 MHz vs 1,859 MHz
> 
> Also, the CPU is faster, which would help to some extent as well.


You got the scores mixed up. The AMD CPU and RTX 3080 system has the higher GPU score and the lower GPU clocks.


----------



## shiokarai

Any info on BIOS dumping/flashing/modding this time around? Possible? Not possible at all? Maybe?


----------



## wholeeo

Shadowdane said:


> It shows right in the score's graphics details: his average clock speeds are about 90 MHz higher.
> 
> Clock frequency: 2,040 MHz vs 1,995 MHz
> Average clock frequency: 1,949 MHz vs 1,859 MHz
> 
> Also, the CPU is faster, which would help to some extent as well.


The 2040 card is mine.


----------



## slopokdave

wholeeo said:


> The 2040 card is mine.


Yeah, the GPU score is actually higher on the AMD system. I'm thinking you should be using Port Royal if you want to compare GPU to GPU.


----------



## bigjdubb

Has anyone done some folding since getting their 3080? I'm curious to see how they do.


----------



## Avacado

bigjdubb said:


> Has anyone done some folding since getting their 3080? I'm curious to see how they do.


No, but I found some info here: Folding Forum • View topic - GeForce RTX 3080 and 3090 support enabled! Looks to be 4M+ PPD (at only 39% GPU load).


----------



## sakete

bigjdubb said:


> Has anyone done some folding since getting their 3080? I'm curious to see how they do.


I folded my laundry this morning. Went pretty well.


----------



## Avacado

sakete said:


> I folded my laundry this morning. Went pretty well.


How many shirts per day were you putting out?


----------



## gerardfraser

The maid does the folding of the shirts, while I play some games in 8K on RTX 3080


----------



## bigjdubb

sakete said:


> I folded my laundry this morning. Went pretty well.


I stopped doing that a long time ago.



Avacado said:


> No, but I found some info here: Folding Forum • View topic - GeForce RTX 3080 and 3090 support enabled! Looks to be 4M+ PPD (at only 39% GPU load).


4M PPD at 39% seems promising, assuming it will rise once the GPU can be fully utilized.


----------



## doom26464

Memory Express called me a week after my preorder because I had preorders on an Asus TUF, an Asus Strix, and an EVGA FTW3. They said they were enforcing a one-card-per-customer rule, so I had to pick one. I went with the EVGA FTW3; hopefully it's a decent card. The power limit on them is 120%, right? Also, hopefully the 2.75-slot cooler is decent. EVGA cards are usually OK but not the best, though then again I have always had their mid-tier or lower stuff, not their FTW line. Their customer support, however, is top notch.

There's no timeline on when the preorder even gets shipped, so that's not great. If I can find a card before then, I'll just cancel. I find Memory Express is usually pretty bad at getting stock on new releases.


----------



## Professor McNasty

doom26464 said:


> Memory Express called me a week after my preorder because I had preorders on an Asus TUF, an Asus Strix, and an EVGA FTW3. They said they were enforcing a one-card-per-customer rule, so I had to pick one. I went with the EVGA FTW3; hopefully it's a decent card. The power limit on them is 120%, right? Also, hopefully the 2.75-slot cooler is decent. EVGA cards are usually OK but not the best, though then again I have always had their mid-tier or lower stuff, not their FTW line. Their customer support, however, is top notch.
> 
> There's no timeline on when the preorder even gets shipped, so that's not great. If I can find a card before then, I'll just cancel. I find Memory Express is usually pretty bad at getting stock on new releases.


They weren't able to provide a timeline at all? That doesn't seem like good customer service. I'll probably reach out in another week or so to get a timeline for my 3080.


----------



## doom26464

Professor McNasty said:


> They weren't able to provide a timeline at all? That doesn't seem like good customer service. I'll probably reach out in another week or so to get a timeline for my 3080.


Nope, none. They said they had no idea.

I also had a pretty poor preorder experience with my 9900K; I ended up finding one from an online retailer far before Memory Express filled my preorder. I think they get poor stock allocation on product launches.


----------



## vigorito

We should buy the cheapest 3080.


----------



## ElectroManiac

Managed to grab a Gigabyte Gaming OC a few days ago from Newegg. Tracking says it will be here on Tuesday.

Anyone here with one of those cards? Just curious how it's performing for you.


----------



## Chargeit

Well, you might have lucked out if you missed out on catching a 3080.


----------



## Alemancio

Chargeit said:


> Well, you might have lucked out if you missed out on catching a 3080.


If you don't watch this video and still plan to buy a 3080, best of luck to you...


----------



## Chargeit

Alemancio said:


> If you don't watch this video and still plan to buy a 3080, best of luck to you...


Agreed. I almost skipped the video because I stopped watching J after he shilled RTX, but I'm happy I decided to click on this one. My hunt for a 3080 has come to an end, unless I see an overbuilt FE for $699.


----------



## HyperMatrix

doom26464 said:


> Memory Express called me a week after my preorder because I had preorders on an Asus TUF, an Asus Strix, and an EVGA FTW3. They said they were enforcing a one-card-per-customer rule, so I had to pick one. I went with the EVGA FTW3; hopefully it's a decent card. The power limit on them is 120%, right? Also, hopefully the 2.75-slot cooler is decent. EVGA cards are usually OK but not the best, though then again I have always had their mid-tier or lower stuff, not their FTW line. Their customer support, however, is top notch.
> 
> There's no timeline on when the preorder even gets shipped, so that's not great. If I can find a card before then, I'll just cancel. I find Memory Express is usually pretty bad at getting stock on new releases.



Just an FYI...Memory Express preorders are on a per-rep/store basis. One store won't know about preorders at another location. So if you want more than 1 card, place a different preorder at each store. They'll hold the cards for 2 days after they become available. Or you can pay in advance to avoid any potential issues like that.


----------



## bkrownd

Chargeit said:


> Well, you might have lucked out if you missed out on catching a 3080.


If you were planning on rolling the dice with Zotac, LOLtac


----------



## HyperMatrix

bkrownd said:


> If you were planning on rolling the dice with Zotac, LOLtac


Zhrooms, the thread starter, assures me that the Zotac is an absolutely amazing card and just as good as any of the others. Fortunately, God gave me a brain and my parents taught me how to use it.


----------



## Chargeit

bkrownd said:


> If you were planning on rolling the dice with Zotac, LOLtac


From what he says, this is an issue across brands, though some are likely to be worse than others.


----------



## bkrownd

Chargeit said:


> From what he says, this is an issue across brands, though some are likely to be worse than others.


Brands that cheap out on the components...like Zotac


----------



## gerardfraser

LOL you can not make this stuff up.


----------



## HyperMatrix




----------



## Chargeit

Newegg has Evga Ftw3 in stock if anyone wants to try their luck,









EVGA GeForce RTX 3080 FTW3 ULTRA GAMING Video Card - Newegg.com


Buy EVGA GeForce RTX 3080 FTW3 ULTRA GAMING Video Card, 10G-P5-3897-KR, 10GB GDDR6X, iCX3 Technology, ARGB LED, Metal Backplate with fast shipping and top-rated customer service. Once you know, you Newegg!




www.newegg.com


----------



## ChaosBlades

Chargeit said:


> Newegg has Evga Ftw3 in stock if anyone wants to try their luck,
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EVGA GeForce RTX 3080 FTW3 ULTRA GAMING Video Card - Newegg.com
> 
> 
> Buy EVGA GeForce RTX 3080 FTW3 ULTRA GAMING Video Card, 10G-P5-3897-KR, 10GB GDDR6X, iCX3 Technology, ARGB LED, Metal Backplate with fast shipping and top-rated customer service. Once you know, you Newegg!
> 
> 
> 
> 
> www.newegg.com


and its gone.


----------



## t1337dude

Wow, NVIDIA couldn't have possibly botched this launch any harder, could they? They're going to piss off everyone who hasn't gotten a card, as well as everyone who HAS gotten a card.


----------



## Professor McNasty

HyperMatrix said:


> View attachment 2460055


EVGA is hands down the best company to buy from. This level of communication is amazing.


----------



## Awsan

But can we fix how ugly that card looks, Mr. EVGA man?


----------



## MacMus

I'd like to buy a 3080 sometime, but I'm letting all the early adopters have fun with it first and discover all the bugs.

Which card has the best caps? I heard there is huge drama about people's games crashing, etc.


----------



## MacMus

EVGA has bad caps, Zotac too... who else?

Can only Asus stay above 2000 MHz?


----------



## HyperMatrix

MacMus said:


> EVGA has bad caps, Zotac too... who else?
> 
> Can only Asus stay above 2000 MHz?


EVGA isn’t bad caps. The highest-clocking 3080 on air so far has been the FTW3, with a score of 12460 in Port Royal and clocks hitting 2070 MHz on the stock BIOS. And that was a reviewer card that probably had 6 POSCAPs, while all units sold to customers use 4 POSCAPs and 2 MLCC clusters.


----------



## VPII

After watching the JayzTwoCents video regarding this 3080 crashing issue, I wanted to check the Palit RTX 3080 GamingPro OC to see what it has on the back. Unfortunately the backplate completely covers the GPU area, so it had to be removed. I had to leave in one screw with a sticker on it, but that was not too much of an issue. I found that this card does in fact have one capacitor array of 10 of the smaller, more expensive capacitors, so all good, it seems. That is probably why I am able to clock the card up to 2145-2175 MHz, which does not help except to increase average clocks.


----------



## MacMus

HyperMatrix said:


> EVGA isn’t bad caps. The highest-clocking 3080 on air so far has been the FTW3, with a score of 12460 in Port Royal and clocks hitting 2070 MHz on the stock BIOS. And that was a reviewer card that probably had 6 POSCAPs, while all units sold to customers use 4 POSCAPs and 2 MLCC clusters.


So wait, they made a different board for reviewers and retail ***?

I heard EVGA cards are also crashing.


----------



## HyperMatrix

MacMus said:


> So wait, they made a different board for reviewers and retail ***?
> 
> I heard EVGA cards are also crashing.


They originally made them with 6 POSCAPs, but that didn’t pass QC. They redesigned them with a 2 + 4 design for the FTW3 and a 1 + 5 design for the XC3, or whatever the cheap one is called. If an FTW3 is crashing, it’s either unrelated to a hardware design issue or simply a higher clock than the die can handle.

Reviewers got original-build models with 6 POSCAPs because consumer models with the new designs weren’t ready at the time. Nobody who purchased an FTW3 got 6 POSCAPs.


----------



## keng

I felt a bit duped by the pricing. And the 2080 Ti pricing didn't drop... but don't tell anyone.


----------



## ElectroManiac

All this mess with the caps kind of makes me regret getting the Gigabyte.


----------



## HyperMatrix

ElectroManiac said:


> All this mess with the caps kind of makes me regret getting the Gigabyte.


On the plus side... you can always sell it for more than you bought it for, if you happen to get your hands on another one you like.


----------



## acrvr

Rbk_3 said:


> Ended up selling mine. Was having way too many issues in Warzone, I’m going to wait for more mature drivers and get the actual card I want. Might pick up an EVGA 2070S and get in que to step up to a FTW3 Ultra. Camped out for 12 hours for a whole lot of headache.
> 
> Sent from my iPhone using Tapatalk


Which card was this and what issue in Warzone?


----------



## Nizzen

Awsan said:


> But can we fix how ugly that card looks, Mr. EVGA man?


Look at your screen while gaming, instead of looking in the case. Fixed.


----------



## Nizzen

Rbk_3 said:


> Ended up selling mine. Was having way too many issues in Warzone, I’m going to wait for more mature drivers and get the actual card I want. Might pick up an EVGA 2070S and get in que to step up to a FTW3 Ultra. Camped out for 12 hours for a whole lot of headache.
> 
> Sent from my iPhone using Tapatalk


Picture or it didn't happen.
What GPU model?


----------



## HyperMatrix

Anyone have an EVGA RTX 3080 FTW3? I know EVGA stated the card has a 420W power limit. But I noticed during live benching stream, it didn't go over a spike of 406W. I assumed it was just a problem with GPU-Z reporting. But I just saw the vbios of the card was uploaded to techpowerup 3 days ago, and it shows a max limit of 400W, which would explain why it just peaked slightly above the 400W limit to 406W, and never went higher towards the 420W limit. The bios is from a review sample model, which likely had the 6x poscap design. It's possible those models were capped at 400W and retail units have the full 420W. Would be interesting to confirm.


----------



## Diffident

HyperMatrix said:


> Anyone have an EVGA RTX 3080 FTW3? I know EVGA stated the card has a 420W power limit. But I noticed during live benching stream, it didn't go over a spike of 406W. I assumed it was just a problem with GPU-Z reporting. But I just saw the vbios of the card was uploaded to techpowerup 3 days ago, and it shows a max limit of 400W, which would explain why it just peaked slightly above the 400W limit to 406W, and never went higher towards the 420W limit. The bios is from a review sample model, which likely had the 6x poscap design. It's possible those models were capped at 400W and retail units have the full 420W. Would be interesting to confirm.
> 
> View attachment 2460105


How do we know what BIOS that is? The FTW cards have 2 BIOS's, a standard and an overclocked.


----------



## HyperMatrix

Diffident said:


> How do we know what BIOS that is? The FTW cards have 2 BIOS's, a standard and an overclocked.


Well, it's still listing the 1800 MHz boost clock. I'm not sure how the regular/overclocked BIOS affects the max power limit that is accessible only through overclocking with the slider; it may or may not cap that too. I'm just asking for confirmation from someone who has the card, because I see 400W here, and during Gamers Nexus's live OC stream the card, maxed out, never went above 406W.

For example with the ROG Strix 3090, they've uploaded both the quiet and oc bios and here they are. Both show the same power limit/boost clocks.


----------



## Diffident

HyperMatrix said:


> Well, it's still listing the 1800 MHz boost clock. I'm not sure how the regular/overclocked BIOS affects the max power limit that is accessible only through overclocking with the slider; it may or may not cap that too. I'm just asking for confirmation from someone who has the card, because I see 400W here, and during Gamers Nexus's live OC stream the card, maxed out, never went above 406W.


I don't know if EVGA lists the standard boost or the overclocked boost.


----------



## HyperMatrix

Diffident said:


> I don't know if EVGA lists the standard boost or the overclocked boost.


The BIOS would show the maximum power. There's no way for the card to go to a higher power level than what the BIOS dump shows; they can change the boost clock and fan curves to limit power usage, but they can't go above that set limit. I'm trying to see whether retail cards properly show the 420W that EVGA claimed, because the review cards were at 400W.


----------



## asdkj1740

The Colorful Advanced has a 470W BIOS once the OC button on the I/O shield is pressed.


----------



## HyperMatrix

asdkj1740 said:


> The Colorful Advanced has a 470W BIOS once the OC button on the I/O shield is pressed.
> View attachment 2460111


That's the highest TDP on a 3080 by far....wowzers. Is that your card? I'm not able to find anything on this 470W anywhere on the interwebs.


----------



## cstkl1

No problems here on the TUF... heck, it even spiked to 2130 MHz.

Tested since people are claiming Metro Exodus can cause crashes.


----------



## asdkj1740

HyperMatrix said:


> That's the highest TDP on a 3080 by far....wowzers. Is that your card? I'm not able to find anything on this 470W anywhere on the interwebs.


No, I saw someone post that.
JayzTwoCents, in his latest video about the crashes, said Colorful doesn't want that button to be used.


----------



## shallow_

First 3080 relisted, a Zotac, because of an 'unserious buyer'. Sounds like a smart buyer to me. It was only listed at cost plus a little on top. Run, scalpers... run.


----------



## doom26464

I don't think EVGA has sent out production runs of their FTW3 cards yet, just review samples as of now.


----------



## wholeeo

Looks like I’ll either be returning my 3080 FE or selling it. I can’t for the life of me get the card to score what others have out of the box. Normally this wouldn’t bother me but I compared my scores to another user on here with specs lower than mine and on one of the Graphic Tests in Time Spy he was achieving nearly 10 FPS more than my system. I’ve tried overclocking, undervolting, fresh install of windows, ddu, etc to no avail.


----------



## gerardfraser

cstkl1 said:


> No problems here on the TUF... heck, it even spiked to 2130 MHz.
> Tested since people are claiming Metro Exodus can cause crashes.


OK, this is not a pissing contest, just some advice for you to try if you want.

It looks like you have your Asus TUF's core and memory set too high and you're losing performance. I would say the memory is the main problem here.

So I tried your settings, and I got better results with lower GPU settings.

AMD 3600XT -4600 Mhz 
Asus Tuf RTX 3080
Core- +40
Memory-+600

I matched the Metro Exodus settings in your video on my AMD rig; I recorded it with Nvidia ShadowPlay but did not bother to upload.
From run 1
Max-124.52 FPS
AVG-89.90
Min-50.89

Your settings in video.
From run 1
Max-120.85 FPS
AVG-80.53
Min-47.07

Now, the reason why I chose these settings: the MAX power limit on the ASUS TUF is 370W and the sweet spot is 340W-350W. So when limits are exceeded, the Nvidia GPU does its thing. I am also not convinced that performance would increase even with, say, a 470W BIOS.


----------



## finalheaven

wholeeo said:


> Looks like I’ll either be returning my 3080 FE or selling it. I can’t for the life of me get the card to score what others have out of the box. Normally this wouldn’t bother me but I compared my scores to another user on here with specs lower than mine and on one of the Graphic Tests in Time Spy he was achieving nearly 10 FPS more than my system. I’ve tried overclocking, undervolting, fresh install of windows, ddu, etc to no avail.


That is very odd. Have you tried lowering your OCs? Both CPU and RAM. I am guessing you're pushing everything to the edge, but sometimes everything combined causes issues. Might as well lower them and run the benchmarks. Try under 5.0GHz and 3600C16. Can't hurt...


----------



## shiokarai

finalheaven said:


> That is very odd. Have you tried lowering your O/Cs? Both CPU and Ram. I am guessing you're pushing everything to the edge, but sometimes everything together combined causes issues. Might as well try to lower them and run the benchmarks. Under 5.0ghz and 3600C16 and try. Can't hurt...


GDDR6X has error correction this time around, so memory won't display any visual artifacts; it will just lose performance if you OC it too much, beyond the point of stability.


----------



## wholeeo

finalheaven said:


> That is very odd. Have you tried lowering your O/Cs? Both CPU and Ram. I am guessing you're pushing everything to the edge, but sometimes everything together combined causes issues. Might as well try to lower them and run the benchmarks. Under 5.0ghz and 3600C16 and try. Can't hurt...





https://www.3dmark.com/3dm/50923743?




The latest I have been able to score on Port Royal. My score is a bit above average but still at the bottom 1%

I'm going to try testing again with bios optimized defaults.

edit:

Bios Defaults, XMP Enabled, 3080 FE Default (custom fan curve)



https://www.3dmark.com/3dm/50924955?


----------



## outofmyheadyo

Professor McNasty said:


> EVGA is hands down the best company to buy from. This level of communication is amazing.


Failed to reply to an email for a week, amazing indeed!


----------



## Professor McNasty

outofmyheadyo said:


> Failed to replay to an email for a week, amazing indeed!


I once called them and they sent out a free fan replacement for my motherboard's northbridge, even after the warranty period ended.


----------



## Chamidorix

HyperMatrix said:


> Anyone have an EVGA RTX 3080 FTW3? I know EVGA stated the card has a 420W power limit. But I noticed during live benching stream, it didn't go over a spike of 406W. I assumed it was just a problem with GPU-Z reporting. But I just saw the vbios of the card was uploaded to techpowerup 3 days ago, and it shows a max limit of 400W, which would explain why it just peaked slightly above the 400W limit to 406W, and never went higher towards the 420W limit. The bios is from a review sample model, which likely had the 6x poscap design. It's possible those models were capped at 400W and retail units have the full 420W. Would be interesting to confirm.


The power limit was bumped up if you check through EVGA Jacob's Twitter history. I'm almost certain it was due to the capacitor redesign. People definitely have 3080 FTWs now, but I haven't seen any actual power limit screenshots. My FTW 3090 is just sitting in the shipping center here in town, closed until Monday. Infuriating.


----------



## Diverge

wholeeo said:


> https://www.3dmark.com/3dm/50923743?
> 
> 
> 
> 
> The latest I have been able to score on Port Royal. My score is a bit above average but still at the bottom 1%
> 
> I'm going to try testing again with bios optimized defaults.
> 
> edit:
> 
> Bios Defaults, XMP Enabled, 3080 FE Default (custom fan curve)
> 
> 
> 
> https://www.3dmark.com/3dm/50924955?


I don't think there is anything wrong with your card. Seems comparable to mine, and I'm not OCing it, just raised the power limit and set a custom fan curve. Mine's in an air-cooled Ghost S1. https://www.3dmark.com/compare/pr/339267/pr/325034/pr/339384


----------



## gerardfraser

Diverge said:


> I don't think there is anything wrong with your card. Seems comparable to mine, and I'm not OCing it, just raised power level and custom fan curve. Mines in an air cooled Ghost S1. https://www.3dmark.com/compare/pr/339267/pr/325034/pr/339384


He is upset that a lesser computer running an RTX 3080 is getting a better score than his. I would be perplexed also. For example, my AMD rig gets a higher score on lower clocks than the Intel 10900K beast with an RTX 3080.
Also on this page, another Intel 10900K with higher clocks gets less performance than my AMD system. That would also be a pain in the behind. I guess I could put an RTX 3080 in an Intel machine and see if I get the same results as the other Intels.

I would suggest the Intel machines should at least be equal to an AMD machine with an RTX 3080 installed.


----------



## fockwulf

Diverge said:


> I don't think there is anything wrong with your card. Seems comparable to mine, and I'm not OCing it, just raised power level and custom fan curve. Mines in an air cooled Ghost S1. https://www.3dmark.com/compare/pr/339267/pr/325034/pr/339384


I did reach similar scores on my Asus TUF with no OC (i7 5930K @ 4.3).

Then I opened the case and put a big fan in front of it. I was able to reach 11930 with +50/+600 and maxed power limit... I assume you need watercooling or a big AC unit to reach better scores.


----------



## cstkl1

gerardfraser said:


> OKthis is not a pissing contest,just some advice for you to try if you want.
> 
> It looks like you have your RTX Asus Tuf core and memory set too high and your losing performance.I would say it the memory that is the main problem here.
> 
> So I tried your settings and I got better results with lower GPU settings.
> 
> AMD 3600XT -4600 Mhz
> Asus Tuf RTX 3080
> Core- +40
> Memory-+600
> 
> Matched Metro Exodus settings in video on my AMD rig I recorded with Nvidia Shadowplay but did not bother to upload.
> From run 1
> Max-124.52 FPS
> AVG-89.90
> Min-50.89
> 
> Your settings in video.
> From run 1
> Max-120.85 FPS
> AVG-80.53
> Min-47.07
> 
> Now the reason why I choose these settings.The MAX power limit on the ASUS Tuf is 370W and the sweet spot is 340W-350W. So when limits are exceeded then Nvidia GPU does it's thing. I am also not convinced that even with 470W BIOS lets say that performance still might not increase.


Learn to OC, read specs, etc. Test OBS, etc.

It's done via desktop capture in OBS at 4:4:4 NVENC BT.709 at 50Mbps. What do you think happens to FPS?

Advice mode on.

Hint to noobs: GDDR6X is rated at 21Gbps.
Very few chips failed and run at 19Gbps.

Good ones do 21.5Gbps stable.. mine does,
and it degrades at 22Gbps.
This is not my max stable core. That's +120, and max bench-able is +150.

+90/+1000 should be achievable by most 3080s.

Advice mode off.


----------



## ZealotKi11er

shiokarai said:


> GDDR6x has error correction this time around so memory won't display any visual artifacts it will just lose performance if you OC it too much, beyond the point of stability


Just to correct this: Nvidia does not have error-correcting memory, and it's not part of GDDR6X. What they have is Error Detection and Replay, which is part of their memory controller. It allows the memory controller to replay the bad data until it gets it right (since you are basically stuck until good data arrives, you lose performance). There is a limit, most likely 3-5 tries, and after that you hang/reset (CTD). ECC is completely different.
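A toy model may make the performance cost of that replay loop clearer. This is an illustrative sketch only; the function names, the CRC stand-in, and the retry limit of 5 are assumptions, not NVIDIA's actual controller interface:

```python
import random

def read_with_replay(read_burst, max_replays=5):
    """Sketch of Error Detection and Replay: re-issue a memory
    transfer until its checksum passes or a retry limit is hit.
    Every extra attempt costs bandwidth, which is why an unstable
    memory OC lowers scores before it ever crashes."""
    for attempt in range(1, max_replays + 1):
        data, crc_ok = read_burst()
        if crc_ok:
            return data, attempt
    # Retry limit exceeded: in practice this is where you hang/CTD.
    raise RuntimeError("replay limit exceeded")

# Toy demo: an overclocked bus where ~30% of transfers fail the check.
random.seed(0)
flaky_bus = lambda: ("burst", random.random() > 0.30)
data, attempts = read_with_replay(flaky_bus)
```

The point of the model: below the crash threshold you just pay `attempts - 1` wasted transfers per bad burst, which shows up as lower benchmark scores, exactly as described above.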


----------



## DokoBG

Gerard with the AMD Ryzen 3600 - slaying 10900K left and right.... at lower clocks !!! Literal GOD. 😃


----------



## gerardfraser

DokoBG said:


> Gerard with the AMD Ryzen 3600 - slaying 10900K left and right.... at lower clocks !!! 😃


Well, just look at the last couple pages lol. You think my 10900K is not jealous? It is so not jealous I only use AMD for 4K PC gaming. My ego is fine. I am done with the stupid.


----------



## cstkl1

The whole point of the video is to educate people:

1. Don't buy a pre-OCed card, the TUF for example. ASIC bin is more important.
2. Asus never bins GPUs to reserve them for their highest end; only EVGA does that.
3. It's the first video to show the TUF boosting in Metro. Based on Igor and a few other YouTubers, this is the benchmark they used to recreate the boosting driver fail issue. With DLSS, because it runs at a lower rendering resolution, the cards boost higher.
4. It's not intended as a benchmark. Why the heck would I record in 4:4:4 BT.709 50Mbps lossless? It's to maintain the highest graphic fidelity so people can view "what I see is what you get".


----------



## cstkl1

BTW, here the TUF OC variant is USD 100 more.

The TUF base version is the cheapest card on the market.

Will post again with the Strix 3080 and Strix 3090.

Out.


----------



## VPII

Okay, after reading about undervolting the RTX 3080 to get higher sustained boost clocks during benchmarks, I dropped EVGA Precision X1 and installed MSI Afterburner. Interestingly, pressing Ctrl+F opens the voltage curve, so instead of messing with it myself I decided to let it do the OC Scan. I then ran 3DMark Time Spy to see what the difference was compared to my first run at stock with only the power limit increased.

This here was my first run at stock, don't actually remember if the power limit was increased.


https://www.3dmark.com/spy/14046561



This next run was after doing the OC Scan on the card, with the power limit increased - check the difference in average clock speed. Sorry, I only realised now that the CPU was also running 300MHz higher for this run, but still, look at the graphics score.


https://www.3dmark.com/spy/14152568



I also wanted to run something a little less demanding, so I ran 3DMark Fire Strike just to see the average clocks. With this run at stock I almost reached my top result with the core overclocked to 2130MHz.


https://www.3dmark.com/3dm/50941462?


----------



## wholeeo

cstkl1 said:


> typical answer from a dude who think he knows it all but doesnt and wants to advice others when he is lacking yet if others answers him in the manner he does he suddenly acts like a victim.
> 
> waste of everybody time.
> 
> also any dude who records from shadowplay.. thats another waste of ppl time watching to garbage quality videos...


----------



## cstkl1

wholeeo said:


> View attachment 2460175


Nice. Don't compare results, dude. ATM even a nerfed Zotac 3080 in the hand is better than nothing at all.

The card's a beast and temps are so cool.

Last night it rained the whole night. Woke up in the morning and saw it idling at 24C. For Malaysian weather that's insane.


----------



## cstkl1

VPII said:


> Okay after reading about undervolting the RTX 3080 to get higher sustaned boost clocks during benchmarks I dropped Evga Precision X1 and installed MSI Afterburner. Interestingly when pressing CTRL F it opens the voltage curve so I decided instead of messing with it myself I'll rather let it do the OC Scan. I then ran 3D Mark Time Spy to see what the difference was compared to the first run I did at stock with only power limit increase.
> 
> This here was my first run at stock, don't actually remember if the power limit was increased.
> 
> 
> https://www.3dmark.com/spy/14046561
> 
> 
> 
> This next run was after doing the OC Scan on the card, but with power limit increased - check the difference in average clocks speed. Sorry only realised now that cpu was also running 300mhz higher for this run, but still look at the graphics score.
> 
> 
> https://www.3dmark.com/spy/14152568
> 
> 
> 
> I also wanted to run something a little less demanding so I ran 3D Mark Fire Strike just to see the average clocks. I mean with this run at stock I almost reached my top result with the card core overclocked to 2130mhz.
> 
> 
> https://www.3dmark.com/3dm/50941462?


Nice.

In MSI AB, try using the Mystic skin. It's easier to navigate.


----------



## shiokarai

ZealotKi11er said:


> Just to correct this. Nvidia does not have error correct memory. Also it not part of GDDR6X. What they have is Error Detection and Replay. This is part of their memory controller. This allows for the memory controller to replay the bad data until it gets it right[since you are basically stuck on this until good data you lose performance,]. Thre is a limit which is most likely 3-5 tries and after that you hang/reset (CTD). ECC is completely different.


Thanks for the correction. Isn't this the same in the end, i.e. as long as you don't CTD while benchmarking, "replaying" the bad data will lower your scores due to memory instability?


----------



## Professor McNasty

keng said:


> ...Titan RTX beats the 3080. And that is without messing with the power/algo.
> You will not get 144 frames at 4k. That is not going to fly, especially as sli is muerte
> 
> In case you are playing the home game,
> these ASICs are machine learning left overs rebranded to fill needs which do not exist.
> 
> At which point did anyone ever ask for raytracing or RT cores?
> Never. This is not the solution to any problem, except industry. And that is ok.


I’ve actually been asking for raytracing for a while now. I saw what the technology could do for the realism in games and was excited that we saw it adopted so quickly by Nvidia with the launch of the RTX series.


----------



## HyperMatrix

keng said:


> ...Titan RTX beats the 3080. And that is without messing with the power/algo.
> You will not get 144 frames at 4k. That is not going to fly, especially as sli is muerte
> 
> In case you are playing the home game,
> these ASICs are machine learning left overs rebranded to fill needs which do not exist.
> 
> At which point did anyone ever ask for raytracing or RT cores?
> Never. This is not the solution to any problem, except industry. And that is ok.



The RTX Titan does not beat the 3080... not to mention the $2500 for the small performance gains it had over the 2080 Ti. The RTX 3080 FE, not some super fancy model, is faster than the RTX Titan. This is not even debated anywhere. I'm not sure how you even came to make such a statement. Every single benchmark out there shows the 3080 clearly ahead.

As for your comments on RT...you're really starting to talk out your rear mate.


----------



## Nizzen

keng said:


> ...Titan RTX beats the 3080. And that is without messing with the power/algo.
> You will not get 144 frames at 4k. That is not going to fly, especially as sli is muerte
> 
> In case you are playing the home game,
> these ASICs are machine learning left overs rebranded to fill needs which do not exist.
> 
> At which point did anyone ever ask for raytracing or RT cores?
> Never. This is not the solution to any problem, except industry. And that is ok.


In what game does the Titan RTX beat the 3080? I want to try that game with my new Palit 3080. Can't wait to try it.


----------



## pewpewlazer

keng said:


> ...Titan RTX beats the 3080. And that is without messing with the power/algo.
> *You will not get 144 frames at 4k. *That is not going to fly, especially as sli is muerte
> 
> In case you are playing the home game,
> these ASICs are machine learning left overs rebranded to fill needs which do not exist.
> 
> At which point did anyone ever ask for raytracing or RT cores?
> Never. This is not the solution to any problem, except industry. And that is ok.


At least there was SOMETHING factual in your post I guess...


----------



## HyperMatrix

pewpewlazer said:


> At least there was SOMETHING factual in your post I guess...


Even that's not entirely true. Although I'm personally targeting 98Hz right now, because the monitor doesn't support DSC so I'd lose image quality above 98Hz, there are many games that will run at 120-144Hz with the 3090 at 2.1GHz. Probably half of all modern games right now will hit the 120-144fps range with an overclocked 3090 under water. Add in DLSS and that number would jump to 75% of games (that support DLSS). Even an OC'd 3080 has a chance of getting to, or at least close to, 120-144fps on some engines/games like Doom/Wolfenstein/BF5/Death Stranding/DMC/Witcher 3/etc...

It's really not bad.


----------



## Vapochilled

As someone already said, this is looking like a pissing thread. I just found out the Titan RTX is faster than the 3080. Must be a new game made by a 2080 Ti owner.

Just to clarify.. the name of the thread is:
3080 owners. So unless you have one, stop hijacking this thread with comments that make no sense.


Now the real talk.
I have a 3080 Gigabyte Eagle OC. Paid 720 euros for it. The cheapest option in Europe.

I was getting crashes in some games.
I changed my MSI AB voltage curve and I have no crashes at all.

I have something like
[email protected]
[email protected]
[email protected]

Nothing above this.
It's working great.
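For anyone wanting to replicate this, the usual trick is to flatten the Afterburner voltage/frequency curve at a chosen point so the card never requests higher voltage bins. A minimal sketch of the idea, using made-up curve points rather than anyone's actual values:

```python
def flatten_curve(curve, cap_mv):
    """Pin every point at or above cap_mv to the clock reached at
    cap_mv, the standard Afterburner undervolt shape: the card then
    tops out at that clock/voltage instead of chasing higher bins.
    curve is a list of (millivolts, MHz) pairs, ascending by voltage."""
    cap_mhz = max(mhz for mv, mhz in curve if mv <= cap_mv)
    return [(mv, mhz) if mv < cap_mv else (cap_mv, cap_mhz)
            for mv, mhz in curve]

# Hypothetical stock curve: higher clock bins demand more voltage.
stock = [(850, 1800), (900, 1905), (950, 1965), (1000, 2010), (1050, 2070)]
flat = flatten_curve(stock, cap_mv=950)
# Every point from 950 mV upward is now pinned to the 950 mV clock.
```

The flattened shape is why this fixes the crashes: the card stops transiently jumping into the 2000MHz+ bins that need the highest voltage and the biggest power spikes.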


----------



## pewpewlazer

Vapochilled said:


> Now the real talk.
> I have a 3080 gigabyte eagle Oc. Paid 720euros for it. The cheapest option in Europe.
> 
> I was getting crash in some games.
> Changed my msi AB voltage curve and I have no crash at all
> 
> I have something like
> [email protected]
> [email protected]
> [email protected]
> 
> No more above than this.
> It's working great


Was it crashing in games at stock? Or was it overclocked with an offset? Was it hitting the magical 2ghz mark when it was crashing?


----------



## Vapochilled

pewpewlazer said:


> Was it crashing in games at stock? Or was it overclocked with an offset? Was it hitting the magical 2ghz mark when it was crashing?


Never tested the defaults.....

I had a custom offset curve. It was hitting 2030MHz or more depending on the power draw and the game. I play at 5K ultrawide, so my power draw is higher.

The curious thing is: all benches would pass.
But in real gaming, because the scenario changes a lot (explosions, buildings, etc.), the game would crash.

After changing the curve to avoid 2000MHz, I have no crashes.
It's doing 1980 and 1995 stable no matter the game or bench (including Fire Strike Ultra). I am using only 0.975v.. 1v.. 0.956v for 1960MHz.

That avoids constantly hitting the power limit. I think that's the problem Igor's Lab stated: those crap capacitors can't handle instant power changes.

I am happy with these results for now.


----------



## Nizzen

keng said:


> csgo
> valorant
> flightsim 2020
> It was a tie in Google Chrome dinosaur run


What cpu and memory do you have? Maybe there is some cpu/memory bandwidth bound scenario here?

Please post settings used and youre hardware, so it's possible to compare


----------



## keng

Nizzen said:


> What cpu and memory do you have? Maybe there is some cpu/memory bandwidth bound scenario here?
> 
> Please post settings used and youre hardware, so it's possible to compare


No, it's a non-issue as people here do not have Titan RTXs.
Also, people are not smart enough to realize that GPU cores designed for ML inference are not going to be good for pew pew. 9k CUDA cores, 10 gigs of RAM... how can it compare with 24 gigs of VRAM and 5k CUDA cores? Thinking is not a strong suit, F5 hype-boys.



UserBenchmark: Nvidia RTX 3080 vs Titan


----------



## Nizzen

Userbenchmark? 

Why not answer my questions?


----------



## Nizzen

keng said:


> no, its a nonissue as people here do not have titan rtxs.
> Also, people are not smart enough to realize that gpu cores designed for ML inference are not going to be good for pew pew. 9k cuda cores, 10gigs of ram... how can it compare with 24gigs of vram and 5k cuda cores? Thinking is not a strongsuit F5-hypeboys.
> 
> 
> 
> UserBenchmark: Nvidia RTX 3080 vs Titan


----------



## VoRtAn

For the people having issues, have you ever tried the Nvidia 460.20 drivers?
https://download1590.mediafire.com/..._GRD_Win10-DCH_x64_WHQL-DEV-International.exe
Try them and let me know your feedback.

OC'ed, 45min in-game, 1.1v max, trying to see a crash hehe with no luck... boosting 2070, when temp rises 2055 solid rock. Nvidia 460.20, fan locked at 60%, MSI TRIO version.

Edit: voltage +100 / core +80 / mem +850.
I give up, it won't crash.

If temps are below 62°C, boost won't go lower than 2115MHz. Air cooled.
Must be great watercooling these cards, even with the modest default power limit... still got 40% fan headroom left.


----------



## keng

HyperMatrix said:


> Even that's not entirely true. Although I'm personally targeting 98Hz right now, because the monitor doesn't support DSC so I'll lose image quality above 98Hz, there are many games that will run at 120-144Hz with the 3090 at 2.1GHz. Probably half of all modern games right now will hit the 120-144fps range with an overclocked 3090 under water. Add in DLSS and that number would jump to 75% of games (that support DLSS). Even an OC'd 3080 has a chance of getting to or at lease close to 120-144fps on some engines/games like doom/wolfenstein/bf5/deaths stranding/dmc/witcher 3/etc...
> 
> It's really not bad.


You will not overclock any 3080 or 3090 above 2GHz, to my knowledge, without crashing games.
It has to do with [email protected]#%^@5 RF shielding resulting in way too much spiking during changes in power loads.

The current state of power delivery on 30-series cards is akin to a grade school assembly teacher causing increasing amounts of microphone feedback while wondering if her telephone is causing it.


----------



## keng

Shilly shill shill.

Linus didn't report the games crashing. Watch how hands are washed.

You can watch Linus's Adam's apple go up and down as he does "hard swallows" deceiving you. So yes, shilly shill.

Anybody with the IQ of a raspberry would figure out that if the PSU was generating dips or noise in the line, it would have been an issue for all the GPUs tested, as the transient power pull is very high on almost all 20-series Tis.


----------



## finalheaven

keng said:


> Yes, there 600+ scores not just 1 guy.
> in case you haven't figured out,
> 
> I AM BITTER @#$%@
> 
> For having wasted time, buying this nonsense.
> 
> I am BITTER that Linuxs tech tips, gamers nexus and all other little @%@% DIDNT say that the CARDS are crashing in GAMES
> 
> THEY KNEW.
> 
> THEY KNEW this is @#$%#$%^#$ crap on a stick.
> 
> The flip was nice though


Even if you keep the clocks at stock speeds and disable all boosts, do the cards crash? Let's go with FE since the other cards seem to be underbuilt.

And if it is true that they do not crash, it is still much better value for games than Titan RTX, 2080ti, and anything else out there for 1440p and 4k gaming. 3080 is a good purchase for games.


----------



## keng

finalheaven said:


> Even if you keep the clocks at stock speeds and disable all boosts, do the cards crash? Let's go with FE since the other cards seem to be underbuilt.
> 
> And if it is true that they do not crash, it is still much better value for games than Titan RTX, 2080ti, and anything else out there for 1440p and 4k gaming. 3080 is a good purchase for games.


The Titan RTX is not for gaming, although it does it well. The thing is, the 3080 is not working properly. The one I had, at least. Boost is handled by the firmware/BIOS/drivers.
I have never liked BOOST as it created software binning of identical GPUs, essentially an idiot tax.


----------



## finalheaven

keng said:


> Titan RTX is not for gaming, although it does it well. The thing is 3080 is not working properly. The one I had at least. Boost is handled by the firmware/bios/drivers.
> I have never liked BOOST as it created software binning of identical GPUS, essentially idiot tax


Ah, got it. Well I only intend to go for the FE edition to upgrade from my 1070 for purely gaming purposes at 1440p high refresh rates (144+). To date, the best card we can get (value + performance) for gaming seems to be 3080, and I assume you agree on that front? At least until AMD shows their GPUs and see what they can do.


----------



## keng

finalheaven said:


> Ah, got it. Well I only intend to go for the FE edition to upgrade from my 1070 for purely gaming purposes at 1440p high refresh rates (144+). To date, the best card we can get (value + performance) for gaming seems to be 3080, and I assume you agree on that front? At least until AMD shows their GPUs and see what they can do.


Yeah, the FE card is fine. It is a beefy piece of chip; the deception surrounding this launch is unfortunately very BITTER and not at all sweet.


----------



## delreylover

VPII said:


> After watching the Jayztwocents video regarding this 3080 crashing issue. I wanted to check the Palit RTX 3080 GamingPro OC to see what they have at the back. Unfortunately the back plate is completely covering the GPU and as such had to be removed. I had to leave in one screw with a sticker on it but was not too much of an issue. I found that this card does in fact have on capacitor array of 10 of the smaller more expensive capacitors so all good it seem. Probably why I am able to clock the card up to 2145 to 2175mhz. Which does not help except increase avarage clocks.


I also have a Palit RTX 3080 GamingPro. However, my card always hits the power limit around 102% TDP, even when the power limit slider is at 109%. I can't get it to boost any higher than 1830 in RDR2. Any idea why?


----------



## HyperMatrix

So is Keng an alt of Mooncheese? What is a non-owner doing spamming negative comments about the 3080 in the owners thread?


----------



## VoRtAn

Boost depends on 3 values: TDP, voltage and temperature.
If you raise TDP and voltage, for example, but temperature is high, you won't see much or any boost, depending on the temperature.
What are these 3 values in RDR2 when hitting 1830?
GPU-Z sensors in-game would also help in understanding the reason.
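Those three factors behave like a min-of-limits function, roughly how GPU-Z's PerfCap readings should be interpreted. The thresholds, the 15MHz bin size, and the penalty values below are illustrative assumptions, not NVIDIA's real boost tables:

```python
def boost_clock(request_mhz, temp_c, power_w, limits):
    """Toy min-of-limits model of GPU Boost: the effective clock is
    whatever the most restrictive limiter (power, temperature,
    voltage) allows. All numbers are illustrative only."""
    caps = [request_mhz]
    if power_w >= limits["power_w"]:        # GPU-Z PerfCap reason "Pwr"
        caps.append(request_mhz - 150)
    if temp_c >= limits["temp_c"]:          # drop one 15 MHz bin per 5 C over
        over_bins = (temp_c - limits["temp_c"]) // 5 + 1
        caps.append(request_mhz - 15 * over_bins)
    if limits["voltage_maxed"]:             # GPU-Z PerfCap reason "VRel"/"VOp"
        caps.append(request_mhz - 15)
    return min(caps)

lim = {"power_w": 340, "temp_c": 62, "voltage_maxed": False}
cool_run = boost_clock(2100, temp_c=55, power_w=300, limits=lim)  # nothing binds
hot_run = boost_clock(2100, temp_c=72, power_w=350, limits=lim)   # power wins
```

This is why logging all three sensors at once matters: a card stuck at 1830 could be pinned by any one of the limiters, and only the binding one explains the clock.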


----------



## delreylover

VoRtAn said:


> Boost depends on 3 values, tdp, voltage and temperature.
> If you raise tdp and voltage for example and temperature is high, you won't see much or any boost, depending on temperature.
> What is these 3 values in RDR2 when hitting 1830 ?
> Gpu-Z sensors in game would also help understanding the reason.


The TDP is set to max via Afterburner; the max this card can go is 109%.
Voltage, well, I tried undervolting to have [email protected], though it still hits the damned power limit...
Temperatures are usually around 72C, though I boosted the fans to 100% and ran the card at 61C for an hour. It changes almost nothing; still power limited.
I'm away from home right now, I'll post GPU-Z screenshots when I'm back.
Thanks anyway.


----------



## pewpewlazer

HyperMatrix said:


> So is Keng an alt of Mooncheese? What is a non-owner doing spamming negative comments about the 3080 in the owners thread?


I think his replies are far too short for them to be the same person


----------



## VPII

delreylover said:


> I also have a Palit RTX 3080 GamingPro. However, my card always hits power limit around 102% TDP, even when the power limit slider is at 109%. I cant get it to boost any higher than 1830 in RDR2. Any idea why?


Mine is the same, 102% even though it is set to 109%, not sure why. What I have done is run the Metro Exodus benchmark at normal settings and 1080p, as then it barely goes over 320W during the run. My reason for doing this was primarily to check boost clocks during the run, to see if the card has stability issues when running over 2000MHz. Well, I am happy to say that during these runs my clocks are basically 2025MHz at the lowest, 2085MHz max, and 2055 almost the entire run, so the card is fine. I have also increased the clocks by 30MHz; the reported clock speeds during the run are basically 30MHz higher and it passes.


----------



## bungusbeefcake

The best most of us can do so far...

https://twitter.com/i/web/status/1310546062541758466


----------



## HyperMatrix

Anyone see this? Lol.


----------



## Avacado

We often cite the "new adopter tax" for a reason. If you are holding a 10-series card and have waited this long to upgrade, why would you rush it?


----------



## DarthBaggins

After seeing all the fallout of this launch, much like the 20-series launch, I want to hold out a little longer before upgrading my 1080 Ti. I know I will be aiming for an ASUS or EVGA card this gen, and more than likely there will be a Super refresh like we saw last gen with AMD's launch (beyond that, I am excited to see what they bring to the table for this new gen of theirs).


----------



## Vapochilled

VoRtAn said:


> For the people having issues, have you ever tried the Nvidia 460.20 drivers?
> https://download1590.mediafire.com/..._GRD_Win10-DCH_x64_WHQL-DEV-International.exe
> Try them and let me know your feedback.
> 
> OC'ed, 45min in-game, 1.1v max, trying to see a crash hehe with no luck... boosting 2070, when temp rises 2055 solid rock. Nvidia 460.20, fan locked at 60%, MSI TRIO version.
> 
> Edit: voltage +100 / core +80 / mem +850.
> I give up, it won't crash.
> 
> If temps are below 62°C, boost won't go lower than 2115MHz. Air cooled.
> Must be great watercooling these cards, even with the modest default power limit... still got 40% fan headroom left.



TUGA POWER!  Here also with a 3080 Gigabyte Eagle OC.

Just noticed new official drivers were released today.

Version: 456.55 WHQL | Release Date: 2020.9.28 | Operating System: Windows 10 64-bit | Language: English (US) | File Size: 609.92 MB

Those 460 versions are developer testing builds, I think. I will try 456.55 before those.


----------



## ZealotKi11er

VoRtAn said:


> Boost depends on 3 values, tdp, voltage and temperature.
> If you raise tdp and voltage for example and temperature is high, you won't see much or any boost, depending on temperature.
> What is these 3 values in RDR2 when hitting 1830 ?
> Gpu-Z sensors in game would also help understanding the reason.


What resolution are you playing?


----------



## Shadowdane

delreylover said:


> The TDP is set to max via afterburner, the max this card can go is 109%
> Voltage, well i tried undervolting to have [email protected], though it still hits the damned power limit...
> temperatures are usually around 72c, though i boosted the fans to 100% and used the card at 61c for an hour. changes almost nothing. power limited
> im now outside of my house, I'll post gpu-z screenshots when I'm home
> thanks anyways


Do you have the latest beta of Afterburner? It added support for Ampere GPUs. If you have an older version, it might not be reporting the power limit correctly.

MSI Afterburner 4.6.5 (Beta2) Download - www.guru3d.com


----------



## maTyaR

Has anyone tried the new drivers yet? Any changes to the GPU boost/voltage?


I have the Asus TUF 3080 non-OC, and it does CTD if I OC (core and mem), but at defaults it's fine.

Someone used another driver and was able to get stable OC clocks (can't tell if the person is just boasting), so that could be another issue.


----------



## VoRtAn

I was using the 460.20 drivers (Quadro version), no issue.
Tried the new "official" version now (NVIDIA GeForce Game Ready 456.55 WHQL); same, works flawlessly... boost, OC, you name it.


----------



## Vapochilled

VoRtAn said:


> I was using 460.20 drivers (quadro version) no issue.
> Tried now the new "official" version (NVIDIA GeForce Game Ready 456.55 WHQL ), same, works flawlessly...boost, oc, u name it


I already posted about this before, but here it goes again.
You're boosting way higher than me because your TDP is only 260W for those 2100MHz.
That's probably because your game is not that intensive, or your resolution is way lower than mine.

At 1950MHz, AND WITH AN UNDERVOLT, my consumption is at 340W! Way higher than yours.
That's because I play at 5K with everything on ultra.
If I play at 2560x1440 for example, my power draw decreases below 300W.
You can try this by rendering above your native resolution (scale it above 100%) and you will notice the difference, and maybe power problems.


----------



## VoRtAn

That's normal if you raise the resolution; it's not an Ampere thing, it happens with all graphics cards.

Rendered in-game at 4K, boost jumps between 2080 and 2040MHz, no more 2100+ boost, haha. Again: no stability issue, crash, or problem whatsoever, *but now, as the screenshot shows, PerfCap hits hard*


----------



## maTyaR

VoRtAn said:


> I was using 460.20 drivers (quadro version) no issue.
> Tried now the new "official" version (NVIDIA GeForce Game Ready 456.55 WHQL ), same, works flawlessly...boost, oc, u name it


Your GPU usage is low (95%+ is where it should be for testing). Also, a much longer test would say more about stability; your chart only shows about 5-10 minutes of game time.


----------



## VoRtAn

Tried that all night with 460.20, no issues and no crashes. The idea of these quick tests is to see max boost, because the reports of crashing were always related to high boost, and as you may all be aware, if temperature rises, boost is lower.
It was a good night for the cats... the office was a little bit warmer.


----------



## maTyaR

Probably a driver update would resolve whatever the problem is right now for many people then.


----------



## Vapochilled

VoRtAn said:


> That's normal if you raise resolution, it's not ampere, happens with all graphics
> 
> rendered in game 4k, boost jumps between 2080 and 2040, no more 2100+ boost haha, again, no stability, crash, problem whatsoever, *but now, like print shows, perfcap hits hard *




That's more like it! Still, a great result for 340W TDP.


----------



## shallow_

maTyaR said:


> Has anyone tried the new drivers yet? Any changes to the GPU boost/voltage?
> 
> 
> I have the Asus Tuf 3080 non OC, n it does CTD if I were to OC (core n mem), but at default, it's fine.
> 
> Someone used another driver and was able to get stable OC clocks (can't tell if person is just boasting), so that could be another issue.


Just watched this review of the Asus Strix 3090, and he too concluded it was driver issues.


----------



## acoustic

Haha, this guy is a serious troll. Your examples of games running faster on a TITAN RTX vs a 3080 are three extremely CPU-limited games.

Someone get this guy a clue.


----------



## maTyaR

shallow_ said:


> just watched this review of the asus strix 3090, and he too concluded with driver issues..


Nice video, and what he claims is another possibility. Considering he can run 2100MHz with ease on Linux, vs 2050MHz on Windows with crashes to desktop, I'd say it's likely driver issues.


----------



## Nizzen

Here is the video to watch on the "problem":






Feel bad for the guy who sold his "faulty" 3080.


----------



## cstkl1

Watch this idiot:

1. Doesn't know that the PLL increments Ampere's clock in the same steps as Turing: 15MHz.
2. Doesn't know the RivaTuner OSD handles the font and sizing.

Simple **** everybody knows.

But yeah, we're supposed to listen to his 4K-8K evaluation that it's a "software bug".

So is he trying to say EVGA R&D is stupid? They recreated the issue and resolved it with a fix: MLCC. They didn't take Igor's word; they found the issue a week ago.


----------



## DFroN

Today’s driver update fixed the crashing for me and many others on the Nvidia reddit. Before, I had to downclock to stop the card boosting to 2GHz; now I can overclock so that the card never dips below 2GHz while gaming, and it’s completely stable. I haven’t messed around with a max OC yet, but 2050MHz seems OK so far.

My observation is that the clock and voltage are much more stable with the new driver; the old driver used to shoot around the boost table aggressively for seemingly no reason (perhaps power saving?).


----------



## VoRtAn

DFroN said:


> Today’s driver update fixed the crashing for me and many others on the Nvidia reddit. Before I had to downclock to stop the card boosting to 2GHz, now I can overclock so that the card never dips below 2GHz while gaming and it’s completely stable  I haven’t messed around with max OC yet but 2050Mhz seems ok so far.
> 
> My observation is that the clock and voltage are much more stable with the new driver, the old driver used to shoot around the boost table aggressively for seemingly no reason (perhaps power saving?).


I've been saying that on forums since the 25th. I tested 4 cards before this (Trio, TUF, you name it), and all of them with issues/crashes reported online were fixed with the 460.20 Quadro version, as simply as a new driver install; not even DDU was needed. Same now with the new driver. I'm not saying 100% of 3080s are fixed by a driver version, but 100% of the ones I tested were "fixed" like that, because those cards were never "broken".
It's always easier to jump on the hype train of YouTubers and tech channels 🚂...


----------



## maTyaR

I can confirm the game now runs stable past 2GHz. I can run stable at 2145MHz core clock and +900MHz on memory before error detection and replay begins to activate.


----------



## VoRtAn

Sorry for the quality; being a YouTuber isn't in my plans, so no lighting, and recorded on a smartphone, hehe... "issue installed, tested and repaired" in 10 minutes.







Better quality version uploading!


----------



## Alemancio

Does anyone know if something happened with restocking of the 3080? It's been 3 days since the main retailers last restocked their inventory...


----------



## EarlZ

ZealotKi11er said:


> Just to correct this: Nvidia does not have error-correcting memory, and it's not part of GDDR6X. What they have is Error Detection and Replay, which is part of their memory controller. This allows the memory controller to replay the bad data until it gets it right (since you are stuck until you get good data, you lose performance). There is a limit, most likely 3-5 tries, and after that you hang/reset (CTD). ECC is completely different.


What would be the best way/app/static scene to test for this, if we are still gaining performance as we increase the clocks?
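The Error Detection and Replay behaviour described in the quote above can be sketched as a toy model (pure illustration, not NVIDIA's implementation; the replay limit of 3 is an assumption taken from the quoted 3-5 range):

```python
import random

# Toy model of GDDR6X Error Detection and Replay (EDR): the memory
# controller re-issues a transfer until its check passes, and gives up
# (hang / crash to desktop) after a few failed replays.
MAX_REPLAYS = 3  # assumed; the quote above estimates 3-5 tries

def transfer(error_rate: float, rng: random.Random) -> bool:
    """One memory transfer; returns True if the integrity check passes."""
    return rng.random() > error_rate

def read_with_edr(error_rate: float, rng: random.Random) -> int:
    """Return the number of tries a read took, or raise past the limit.

    Each replay costs bandwidth, which is why an unstable memory OC can
    still 'work' while silently losing performance.
    """
    for attempt in range(1, MAX_REPLAYS + 1):
        if transfer(error_rate, rng):
            return attempt
    raise RuntimeError("replay limit exceeded -> hang / crash to desktop")

rng = random.Random(0)
tries = [read_with_edr(0.3, rng) for _ in range(5)]
print(tries)  # mostly 1s, with an occasional retry at this error rate
```

This is also why the question above is a good one: with EDR you do not see artifacts, only a gradual FPS loss, so benchmarking (rather than eyeballing a scene) is the way to catch it.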


----------



## HyperMatrix

EarlZ said:


> What would be the best way/app/static scene to test for this if we are still getting performance as we increase the clocks


MSI Kombustor has a Pause feature which keeps the camera locked in place, with no moving objects. But your best bet is to just run a full benchmark between tests using Port Royal.


----------



## EarlZ

HyperMatrix said:


> MSI Kombustor has a Pause feature which keeps the camera locked in place and there are no moving objects. But your best bet is to just run a full bench between tests using the port royal benchmark.


Probably just lock the fan speed to 100% to eliminate thermal throttling on the GPU and isolate the memory gains?


----------



## HyperMatrix

EarlZ said:


> Probably just lock the fan speed to 100% to eliminate the GPU throttle and isolate memory gains ?


Yeah, I saw Gamers Nexus' results when he was doing his hours-long benching stream. At one point he decided to lower the GPU clock and instead push up the mem clocks a bit, and got even higher performance. But yeah, full fans and full Port Royal runs would give you the best indication. A static scene may not be the best for testing memory performance.


----------



## cstkl1

I have 4k hours on this game, so I tested the new drivers. Noticed the GPU clock was higher, power lower, and so were the frames.

I already had 40 hours in this game with this card.

Let's test.
New








vs Older










Spoiler






















Shame on you, NVIDIA. Just to rescue Zotac, you nerfed those who are not affected as well. Zotac won't do a recall.

What's shocking is how many dumbass influencers and YouTubers on AIB payrolls are spouting nonsense.
Are you trying to tell me EVGA R&D lied?


----------



## cstkl1

HyperMatrix said:


> Yeah I saw Nexus Gamer's results when he was doing his hours long benching stream. At one point he decided to lower GPU clock and instead push up mem clocks a bit and got even higher performance. But yeah full fans, and full port royal runs would give you the best indication. A static scene may not be the best for testing memory performance.


Steve's OC skills are akin to a 5-year-old's. Don't trust anything you see.


----------



## cstkl1

Nizzen said:


>


Ignore him.

For 3080-vs-Titan comparisons, the only dude I trust is thirtyIR @Baasha


----------



## MikeGR7

cstkl1 said:


> have 4k hours on this game
> 
> so tested new drivers.. noticed gpu clock was high, power less and so is frames..
> 
> had already 40 hours in this game with this card.
> 
> lets test
> New
> 
> 
> 
> 
> 
> 
> 
> 
> vs Older
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Shame on you nvidia. Just to rescue Zotac.. you nerfed those who are not affected as well. zotac wont do a recall.
> 
> whats shocking is how many dumbass influencer and youtuber on aib payroll spouting nonsense.
> are you trying to tell me evga R&D lied.


Are you sure it's under the same conditions?
For example, I don't know what game this is, but it looks like the lighting is different.
It also seems to me the old test had the power limit maxed, and the new one has it at default.


----------



## Talon2016

cstkl1 said:


> have 4k hours on this game
> 
> so tested new drivers.. noticed gpu clock was high, power less and so is frames..
> 
> had already 40 hours in this game with this card.
> 
> lets test
> New
> 
> 
> 
> 
> 
> 
> 
> 
> vs Older
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Shame on you nvidia. Just to rescue Zotac.. you nerfed those who are not affected as well. zotac wont do a recall.
> 
> whats shocking is how many dumbass influencer and youtuber on aib payroll spouting nonsense.
> are you trying to tell me evga R&D lied.


You're at 100% power in one picture and 103% in the other? At least I assume that's the power limit. I'll also go with the guy above: the lighting difference could easily explain some of the performance gap.


----------



## cstkl1

MikeGR7 said:


> Are you sure that it's under the same conditions?
> For example i don't know what game is this but it looks like the lighting is different.
> Also seems to me the old test had the power limits maxed and the new has it at default.


Already told you, it does affect the FPS. The conditions are the same; I can load into the game any time when the day/night cycle changes.

This was literally a few minutes apart.

I know how this game's FPS works, and as I said, I have 40 hours on this card, ONLY on this game.

I only ran this after noticing strange FPS behaviour over two games, so the best way to test is in the keep, at that spot; other areas have a lot of CPU-bound stuff.

The FPS tanked with the latest drivers.


----------



## cstkl1

Talon2016 said:


> You're at 100% power and 103% power in another picture? At least I assume thats power limit.. I'll also go with the guy above that the lighting difference could easily explain some performance.


The power limit is at 117%.

I mean, if everybody is going to clap for NVIDIA on this and let Zotac get away with it while we get nerfed on FPS, go ahead.

I can only tell you what I see.


----------



## Talon2016

cstkl1 said:


> power limit is at 117%..
> 
> i mean if everybody gonna clap on nvidia for this and allow zotac to get away with it while being nerfed on fps.. go ahead..
> 
> i can only tell what i see.


Oh, trust me, I'm not defending anyone if that is actually happening; I'm just not experiencing a downclock or performance loss. It's possible, but I'll have to do more testing. Will let you know if I find anything.


----------



## cstkl1

Talon2016 said:


> O trust me I'm not defending anyone if that is actually happening, I'm just not experiencing a downclock or performance loss. It's possible but I'll have to go do more testing. Will let you know if I find anything.


I only know that in this game there's a drop in FPS while TDP is lower, even though the card sustains high boost clocks for longer.

It's the only game I play.

I hope your feedback is based on games you play. I don't trust benchmarks, as NVIDIA etc. know how to optimize for those.


----------



## Joeking78

Returning member.

My MSI 3080 Gaming Trio arrives in a few hours, upgrading from a GTX 780 Ti, which died last week after 5 years of service.


----------



## PraiseKek

Joeking78 said:


> Returning member.
> 
> MSI 3080 Gaming Trio arrives in a few hours, upgrading from a 780GTX TI which died after 5 years service last week.


enjoy man..gonna be tasty


----------



## Joeking78

Cheers.

Not sure if I'll experience all these crashes I see mentioned online; I ordered over a week ago, before the news started coming out, and the card got stuck in customs here.

It will be nice to finally play some top-level games after using a single 780 for 6 months. I slowly went from 3-way SLI, to 2-way SLI, and down to a single GPU as the cards declined/died over the years... this 3080 should be a great experience and better than anything I've used before.


----------



## VPII

I'll be honest, the best way to see whether you get crashes is to run game benchmarks at 1080p on medium settings or so. This keeps the load on the GPU lower, at or just over 320W, so you get more sustained GPU clocks over 2000MHz. My Palit RTX 3080 GamingPro OC running the Shadow of the Tomb Raider benchmark stays constantly above 2000MHz, 2010MHz at the lowest but peaking at 2085MHz, basically holding 2085MHz right through the entire final scene.


----------



## cstkl1

Joeking78 said:


> Cheers.
> 
> Not sure if I'll experience all these crashes I see mentioned online, I ordered over a week ago before the news started coming out and the card got stuck in customs here.
> 
> Will be nice to finally play some top level games after using a single 780 for 6 months, slowly went from 3-way SLI, 2-way SLI and down to single GPU as cards slowly declined/died over the years...this 3080 should be a good experience and better than anything I've used before


I was rocking a 780 while waiting for the RTX 3080.

On v2, low [email protected], it could hardly handle 60fps.

So the 3080 is around 4-5 times that: vmin 25 vs 130-150.


----------



## cstkl1

VPII said:


> I'll be honest, the best way to see whether you get crashes is to run game benchmarks at 1080P medium settings or so. This would mean the load to the gpu would be lower or just over 320watt therefor you'll get more constant gpu clocks over 2000mhz. My Palit RTX 3080 GamingPro OC running the Shadow of Tomb Raider benchmark would be constantly above 2000mhz, 2010mhz the lowest but peak at 2085, basically going 2085mhz right through the entire final scene.


DLSS, bro, like in Metro: because it's rendering at a way lower resolution, the GPU clocks will be high. That's why Metro Exodus will normally find those problematic GPUs.

On another note: there's a user who bought a Gigabyte 3090 that was crashing on his Ryzen 5 in Forza. He had also ordered a TUF setup (Intel 10900K, TUF motherboard, etc.) with a 3080 TUF. He switched the 3090 into that system: no more crashes, Gigabyte 3090 stable, on the previous driver.

Interesting, eh? No OC on anything.


----------



## VPII

cstkl1 said:


> dlls bro like metro. because its running at way lower render resolution.. the gpu clocks will be high. thats y metro exodus.. normally will find those problematic gpu
> 
> another note. theres a user he bought a giga 3090.. was crashing on his ryzen 5. forza.. he also ordered a tuf setup with intel 10900k tuf mobo etc and a 3080 tuf. he switched the gpu to that. no more crashes. 3090 giga stable. the previous driver
> 
> interesting eh. no oc on anything.


Look, my Palit was stable even with the first drivers, no issues. What I have done is play Metro Exodus at 1080p so the card can run above 2000MHz core basically all the time. It sat at 2070MHz most of the time with no issues, crashes or anything. I do know that my card actually has one array of 10 MLCC caps, so that might be part of the reason, but from what I saw in a YouTube teardown of these cards, the Palit has the 10 MLCC caps while the other 5 larger caps are not that great, or their capacitance is maybe too low.


----------



## Bercon

Does anybody have the 3080 TUF *non*-OC model? What are the power limits on that card, and are they different from the OC model?

EDIT: Seems like it was already answered here; yes, they have the same power limits: [Official] NVIDIA RTX 3080 Owner's Club


----------



## delreylover

Can we mod the BIOS or replace it with another AIB's BIOS to get higher power limits? As far as I can remember, this was possible on the 2080 Ti. I'm currently running a Palit RTX 3080, but power limits are causing low clocks even at low temperatures in certain games like RDR2...


----------



## VPII

delreylover said:


> Can we mod the BIOS or replace it with another AIB BIOS to get higher power limits? As far as I can remember, this was possible on 2080Ti. I'm currently rocking a Palit RTX 3080 but power limits are causing my card to have low clocks even at low temperatures on certain games like RDR2...


You'll be stuck with that for now, as there is no nvflash at present that recognises the RTX 3000 series. The other issue for me with my Palit is that even though you can increase the power limit by +9% to 350W, it still downclocks the moment it goes over 320W.
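For reference, the power-limit sliders discussed here are just multipliers on the vBIOS power target (a quick sketch; the 320W base is the reference 3080 TDP from the specs at the top of the thread):

```python
# Afterburner's power-limit slider is a percentage of the board power
# target baked into the vBIOS (320 W on a reference RTX 3080).
def power_limit_watts(base_tdp_w: float, slider_percent: float) -> float:
    """Convert a power-limit slider setting into an absolute wattage."""
    return base_tdp_w * slider_percent / 100.0

# Values mentioned in this thread, assuming a 320 W base:
print(power_limit_watts(320, 109))  # 348.8 -> the "+9% for 350 W" Palit cap
print(power_limit_watts(320, 117))  # 374.4 -> in line with GPU-Z's 375 W maximum
```

So a card downclocking the moment it crosses 320W despite a 109% slider is ignoring the raised target, not mis-computing it.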


----------



## delreylover

VPII said:


> You'll be stuck with that for now as no nvflash at present that recognise the RTX 3000 series. The other issue as well for me with my Palit is even though you can increase the power limit to +9% for 350watt it still downclocks the moment it hits over 320watt.


Thanks for the information, and yes, exactly: it hits 320W and downclocks immediately. I emailed Palit but have had no reply yet. I think this card does not respect the power-limit setting. I'm not sure what causes this; maybe their BIOS is buggy. They need to release an update with increased power limits and this issue fixed. They did increase the power limits on the Palit 2080 Ti some time after its initial release; let's hope they do the same for us.


----------



## HyperMatrix

Interesting test regarding undervolting: almost no performance loss while dropping TDP to 200W. Let's hope the miners don't see this. Haha.









Undervolting Ampere, GeForce RTX 3080 Hidden Efficiency Potential?


The GeForce RTX 3080 has rocked the world with the fastest flagship performance to date, but can we tame it with undervolting?




wccftech.com
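The appeal is easy to see as perf-per-watt math. A back-of-the-envelope sketch (all numbers below are hypothetical placeholders, not the article's measured figures):

```python
# Why undervolting looks so good: fps barely drops while watts plummet.
# The numbers here are illustrative assumptions, not measured results.
def perf_per_watt(fps: float, watts: float) -> float:
    """Frames per second delivered per watt of board power."""
    return fps / watts

stock = perf_per_watt(100.0, 320.0)       # stock 3080 at its 320 W TDP
undervolted = perf_per_watt(97.0, 200.0)  # assume ~3% fps loss at 200 W
print(f"efficiency gain: {undervolted / stock - 1:.0%}")  # ~55% better fps/W
```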


----------



## Arni90

HyperMatrix said:


> Interesting test regarding undervolting. Almost no performance loss, while dropping TDP to 200W. Let's hope the miners don't see this. Haha.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Undervolting Ampere, GeForce RTX 3080 Hidden Efficiency Potential?
> 
> 
> The GeForce RTX 3080 has rocked the world with the fastest flagship performance to date, but can we tame it with undervolting?
> 
> 
> 
> 
> wccftech.com


Interesting, but not at all surprising.


----------



## doom26464



Alemancio said:


> Do you know if something happened with restocking of 3080? it's been 3 days since last time main retailers restocked their inventory...


Since this comment will get ignored by the 3 people in here with cards talking back and forth, I'll give you an answer.

There will probably be no stock this week; from what I heard, it's Golden Week in China, which should add to an already delayed launch.


----------



## VPII

delreylover said:


> Thanks for the information and yes, exactly. It hits 320 and downclocks immediately. I emailed Palit but got no reply yet. I think this card does not respect the power limit setting. I'm not sure what can cause this but maybe their BIOS is buggy. They need to release an update with increased power limits and with this issue fixed. They did increased power limits on Palit 2080Ti some time after the initial release, lets hope they do the same thing to us.


What I found interesting: with the latest driver from Nvidia's website (456.55, I think), running Time Spy I saw power above 320W most of the time and my clocks dropped into the 1900MHz range immediately. So I started to overclock my card and re-run Time Spy. I got up to +165MHz on the core and it passed, with the result below. I mean, a 19000 GPU score on stock air cooling.



https://www.3dmark.com/spy/14223059


----------



## zhrooms

*Download NVIDIA NVFlash 5.660.0 Ampere*
*Password:* Ampere

NVFlash leaked/released early by Inno3D


----------



## VPII

zhrooms said:


> *Download NVIDIA NVFlash 5.660.0 Ampere
> Password:* Ampere
> 
> NVFlash leaked/released early by Inno3D


My friend, you are a star


----------



## AlKappaccino

zhrooms said:


> *Download NVIDIA NVFlash 5.660.0 Ampere
> Password:* Ampere
> 
> NVFlash leaked/released early by Inno3D


Very nice! Good find.


----------



## acoustic

zhrooms said:


> *Download NVIDIA NVFlash 5.660.0 Ampere
> Password:* Ampere
> 
> NVFlash leaked/released early by Inno3D


What a hero


----------



## Talon2016

Flashed my 3080 XC3 with the FTW3 vBIOS; no go. The card reports false power draw, clocks are lower, around 1700MHz, and I can't really get them to increase.


----------



## eeroo94

Talon2016 said:


> Flashed my 3080 XC3 with FTW3 vBIOS, no go. Card reports false power pull, clocks are lower around 1700Mhz, cannot really get them to increase.


The FTW3 is a 3x 8-pin card and the XC3 is 2x 8-pin. The TUF seems to be the highest-power-limit card with 2x 8-pin.


----------



## VPII

eeroo94 said:


> FTW3 is 3x 8 pin card and XC3 is 2x 8pin. TUF seems to be the highest power limit card with 2x 8pin


I was able to flash my Palit with the FTW3 Ultra BIOS; however, upon starting benchmarks that usually pull 320 to 330W, it immediately reported 400W with very low clocks. So I flashed it back to stock. Not sure why?


----------



## eeroo94

VPII said:


> I was able to flash my Palit with the FTW3 Ultra bios, however upon starting benchmarks usually pulling 320 to 330 watt it immediately pulled 400watt and clocks very low. So I flashed it back to stock. Not sure why?


3x 8pin card bioses don't work properly with 2x 8pin cards.


----------



## VPII

eeroo94 said:


> 3x 8pin card bioses don't work properly with 2x 8pin cards.


Yes, I cannot argue that, but it still does not explain why 320 to 330W becomes 400W literally instantly.


----------



## Talon2016

Anyone out there with the Asus TUF vBIOS care to use Ampere NVFlash to make a backup of their vBIOS and upload it for us?


----------



## shALKE

Where can I get a reference bios ?


----------



## Talon2016

shALKE said:


> Where can I get a reference bios ?


Tech Power Up has the reference 3080 vBIOS hosted.


----------



## shALKE

Talon2016 said:


> Tech Power Up has the reference 3080 vBIOS hosted.


I can only see the FE (which is not the reference board) and the FTW3 Ultra (with a 3x 8-pin connector).


----------



## delreylover

Can someone please backup and upload Asus TUF BIOS please?


----------



## KingEngineRevUp

Nvidia's new GPU drivers improve the OC headroom of my RTX 3080 Gigabyte OC.











https://www.3dmark.com/compare/fs/23626808/fs/23616711

https://www.3dmark.com/compare/spy/14231371/spy/14011322

https://www.3dmark.com/compare/pr/348656/pr/312829


----------



## Pet_gz

delreylover said:


> Can someone please backup and upload Asus TUF BIOS please?


https://fil.email/02Fu498Q

Asus TUF 3080 Bios ( No OC Version).












Please, Asus TUF 3080 *OC Version* Bios???


----------



## Nizzen

Pet_gz said:


> Download files - Filemail
> 
> Asus TUF 3080 Bios ( No OC Version).
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Please, Asus TUF 3080 *OC Version* Bios???


I'm trying to dump my Palit 3080 OC BIOS. It says the device is not supported?


----------



## Pet_gz

Nizzen said:


> I'm trying to download my Palit 3080 oc bios. It says device is not supported?


With *NVFlash 5.660.0 Ampere* , in the first post ( Flash Guide - Bios Backup):









[Official] NVIDIA RTX 3080 Owner's Club


Last Updated: November 13, 2020 Note: This content is licensed under Creative Commons 3.0. This means that you are free to copy and redistribute this material, but only if the following criteria are met: 1) You must give appropriate credit by linking back to this thread. 2) You may not use this...




www.overclock.net


----------



## Nizzen

Pet_gz said:


> With *NVFlash 5.660.0 Ampere* , in the first post ( Flash Guide - Bios Backup):
> 
> 
> 
> 
> 
> 
> 
> 
> 
> [Official] NVIDIA RTX 3080 Owner's Club
> 
> 
> Last Updated: November 13, 2020 Note: This content is licensed under Creative Commons 3.0. This means that you are free to copy and redistribute this material, but only if the following criteria are met: 1) You must give appropriate credit by linking back to this thread. 2) You may not use this...
> 
> 
> 
> 
> www.overclock.net


I mean extract the BIOS from my Palit 3080, to make a backup BIOS. Sorry, I'm a noob.


----------



## Pet_gz

Nizzen said:


> I mean extract bios from my Palit 3080. Make a backup bios. Sorry I'm noob


Check the flash guide; GPU-Z has no support in its latest version.


----------



## Talon2016

Nizzen said:


> I mean extract bios from my Palit 3080. Make a backup bios. Sorry I'm noob


Download and extract the Ampere NVFlash archive with 7-Zip. The password is "Ampere".

Put nvflash64 into a folder on the C: drive called Nvflash.

Copy that folder's path, then search for "cmd", right-click it, and run as administrator.

Once in the command window, type cd, a space, then right-click to paste the copied path.

Now you should be in the Nvflash folder.

Type nvflash64 -b palit3080.rom and press Enter.

Your display will flicker off and back on as the tool briefly disables the device to save the backup ROM. The ROM will be saved to the Nvflash folder.

Please upload the .rom file here so we can all test. Thanks!
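Condensed, the steps above look like this in an elevated Command Prompt (a sketch; the C:\Nvflash folder and the palit3080.rom output name are just the examples from this post):

```shell
:: Run in an elevated Command Prompt (right-click cmd.exe, Run as administrator).
:: Assumes nvflash64.exe was extracted to C:\Nvflash (archive password: Ampere).
cd /d C:\Nvflash

:: Dump the card's current vBIOS to a backup file in this folder.
:: The display will flicker briefly while NVFlash re-initialises the GPU.
nvflash64 -b palit3080.rom
```

This is a transcript of the procedure, not something to run blindly; it needs admin rights and a supported GPU in the system.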


----------



## Talon2016

Just tested the Asus TUF vBIOS on the XC3, and while it flashes fine, you are still limited to ~320W. What is the highest-TDP reference board?


----------



## Pet_gz

Talon2016 said:


> Just tested the Asus TUF vbios on the XC3 and while it flashses fine you are still limited to 320w~. What is the highest TDP reference board?


Mine??

GPU-Z says the maximum is 375W; my maximum is 360W with +141MHz core (stock voltage).

Previous drivers:










The highest-TDP board is the Asus Strix OC (3x 8-pin).


----------



## HyperMatrix

*deleted. forgot this is 3080 thread. sorry.


----------



## delreylover

Talon2016 said:


> Download and extract the ampere nvflash file with 7zip. Password is "Ampere".
> 
> Put Nvflash64 into a folder on C: drive called Nvflash.
> 
> Copy the directory to that folder, then go to search and search for "cmd". Right click open as admin.
> 
> Once in command window type cd (space) right click paste that copied directory.
> 
> Now you should be in the Nvflash folder directory.
> 
> Type Nvflash64 -b palit3080.rom (enter)
> 
> It will flicker your display off then on as it disables the device for a second to save the backup rom. The rom will save to the Nvflash folder.
> 
> Please upload the .rom file here so we can all test. Thanks!


I can upload it, but you shouldn't try it. It's a buggy BIOS: power limited to 350W, but as soon as it reaches 320W the GPU starts throttling. GPU-Z shows the limit reason as power. This happens with the power-limit slider set to 109%...


----------



## Alemancio

doom26464 said:


> Since this comment will get ignored by the 3 people in here who have cards talking back and forth, ill give you an answer.
> There probaly be no stock this week as from what I heard its golden week in china so that should probaly add to an already delayed launch


Thanks!!!


----------



## Alemancio

I think this launch is incredibly frustrating. It's like Nvidia doesn't want my money. As a customer, it's the worst I've (we've) ever been treated. Shame on Nvidia and their most idiotic product launch ever, one that has us subscribing to stock checkers, really? How ELSE can we get a 3080?


----------



## Nizzen

Alemancio said:


> I think that this launch is incredibly frustrating. It's like Nvidia doesn't want my money. As a customer, it's been the most frustrating I've (we've) been treated ever. Shame on Nvidia and their most idiotic product launch ever, that has us subscribing to stock checkers, really? How ELSE can we get a 3080?...................................................


MANY people have 3080/3090 now including me. I Guess you ordered a few minutes too late 
Waiting is boring, we know


----------



## pewpewlazer

Alemancio said:


> I think that this launch is incredibly frustrating. It's like Nvidia doesn't want my money. As a customer, it's been the most frustrating I've (we've) been treated ever. Shame on Nvidia and their most idiotic product launch ever, that has us subscribing to stock checkers, really? How ELSE can we get a 3080?...................................................


No different from Turing launch. Or Pascal launch. Or probably the generations before those, but I was too busy drinking my way through college to pay attention.

The fact that we can't just back-order cards, plus the artificial scarcity, is a complete joke. I was able to back-order a damn VESA-mount monitor arm a month ago (which is finally arriving tomorrow), but I can't do that for a graphics card? Why? I wouldn't mind spending $700-800 for a 20% performance boost, but I'm not about to spend even 7-8 minutes stalking the internet for the opportunity to do so.


----------



## VPII

Okay, I flashed my Palit with the FTW3 Ultra BIOS, and for some reason benchmarks that at most draw 335W immediately reported 400W. This morning I ended up using the Asus TUF BIOS posted earlier (thank you), and it behaved very similarly to my Palit BIOS, but performance was a little lower. The 340W stock power limit of the Asus TUF does not mean a thing for my card, as it still drops clocks the moment it passes 320W. Yes, it would pull up to 330W, but clocks would drop to 1935MHz or thereabouts. So I flashed my card back to the normal Palit GamingPro OC BIOS and ran Time Spy again.

Now, given that I know my card can run 2070MHz without issue (tested with Shadow of the Tomb Raider and Metro Exodus at 1080p to limit power draw and keep clocks high), I decided to see how far I could push the core clock in Time Spy, since it draws 330W or more, but if my base clocks are higher it won't drop as much during the bench. Well, I got it to pass several times with the core speed increased by 165MHz, which would basically mean a max core speed of 2205MHz, though it held an average speed of 1981MHz, as per the link below.



https://www.3dmark.com/spy/14243859



I thought that was great, given it scored 19103 on GPU. Look, I am no expert at this, but I do believe the only people who could increase the power limit on a board are the AIB partners. You cannot flash the card with the FE BIOS; it just does not work, or I missed something to add to the flashing command line.


----------



## shiokarai

VPII said:


> Okay I have flashed my Palit with the FTW3 Ultra bios and for some reasons running benchmarks that at most draw 335watt immediately pulled 400watt. This morning I ended up using the Asus Tuff bios posted earlier, (Thank you) and it behaved very similar to my Palit bios but performance was a little less. The 340watt stock power limit on the Asus Tuf card does not mean a thing for my card as it still would drop clocks the moment it passes the 320watt limit. Yes it would pull up to 330watt but clocks would drop to 1935mhz or there about. SO I flashed my card back with the normal Palit GamingPro OC bios and ran Time Spy again.
> 
> Now, given that I know my card can run 2070MHz without an issue (tested with Shadow of the Tomb Raider and Metro Exodus at 1080p to limit power draw and keep the clocks high), I decided to see how far I could take the core clock in Time Spy, which draws 330W or more; if my base clocks are higher, they won't drop as much during the bench. Well, I got it to pass several times with the core offset increased by 165MHz, which basically means a max core speed of 2205MHz, though it held an average speed of 1981MHz as per the link below.
> 
> 
> 
> https://www.3dmark.com/spy/14243859
> 
> 
> 
> I thought that was great, given that it gave me a 19103 GPU score. Look, I am no expert at this, but I do believe the only people who would be able to increase the power limit on a board are the AIB partners. You cannot flash the card with the FE BIOS; it just does not work, or I missed something in the flashing command line.


Honestly, 320W vs 340W will be hardly noticeable, even in monitoring software. Also, flashing a 3x8-pin BIOS onto a 2x8-pin card doesn't work, as you know now. Is your card water-cooled or still on air? Those seem like good clocks for an air-cooled 2x8-pin card.


----------



## VPII

shiokarai said:


> Honestly, 320W vs 340W will be hardly noticeable, even in monitoring software. Also, flashing a 3x8-pin BIOS onto a 2x8-pin card doesn't work, as you know now. Is your card water-cooled or still on air? Those seem like good clocks for an air-cooled 2x8-pin card.


My card is still on stock air cooling, obviously with the fans at 100% while I bench, so temps are sometimes over but mostly below 50°C if I run in the morning with the cooler air temps.


----------



## zalo1196

Hi everyone, this is my Asus RTX 3080 TUF OC performance BIOS. Enjoy!








Download files - Filemail (fil.email)


----------



## VPII

shiokarai said:


> Honestly, 320W vs 340W will be hardly noticeable, even in monitoring software. Also, flashing a 3x8-pin BIOS onto a 2x8-pin card doesn't work, as you know now. Is your card water-cooled or still on air? Those seem like good clocks for an air-cooled 2x8-pin card.


Oh, and with regard to 320 and 340W: if you run 3DMark Time Spy the power usage never reaches 340W, which would mean your clocks should in fact remain maxed out (temperature permitting) while running the benchmark. But that is not the case with the Asus TUF BIOS, which has a stock 340W power limit: the clocks still drop when reaching and passing 320W.


----------



## VPII

zalo1196 said:


> Hi everyone, this is my Asus RTX 3080 TUF OC performance BIOS. Enjoy!
> 
> Download files - Filemail (fil.email)


Thanks a million


----------



## Mucho

VPII said:


> So I flashed my card back to the normal Palit GamingPro OC BIOS and ran Time Spy again.


Hi, where did you get the Palit GamingPro OC bios from? I would like to flash my Palit GamingPro NON OC with the OC bios.
Thx


----------



## VPII

Mucho said:


> Hi, where did you get the Palit GamingPro OC bios from? I would like to flash my Palit GamingPro NON OC with the OC bios.
> Thx


It is from my card; I'm just not sure how to upload it. With some guidance I can do it for you.


----------



## Mucho

VPII said:


> It is from my card, not sure how to upload it. Some guidance then I can do it for you


You already downloaded the rom via Nvflash?
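(For anyone else wondering how the dump is done: saving the currently-flashed ROM is a single nvflash command, run from an elevated prompt. The filename here is just an example.)

```shell
# Dump the VBIOS currently on the card to a file you can upload and share
nvflash64 --save palit_gamingpro_oc.rom
```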


----------



## VPII

Mucho said:


> You already downloaded the rom via Nvflash?


Yes.... Check quickly if the link works. I used GoFile to upload.

Gofile

Okay, the link works; the Palit GamingPro OC BIOS is there.


----------



## Mucho

VPII said:


> Yes.... Check quickly if the link works. I used GoFile to upload.
> 
> Gofile
> 
> Okay link works to here with the Palit GamingPro OC bios


Thx, it worked!


----------



## VPII

Mucho said:


> Thx, it worked!


Let me know how it works for you


----------



## Mucho

VPII said:


> Let me know how it works for you


Yes, it worked for me; I'm hitting 2070MHz more stably than before. The PL is the same at 109%. The BIOS of the non-OC version seems to be a little crappy.


----------



## Vapochilled

I have the Eagle OC from Gigabyte.
340W is the limit I've seen.
Looking forward to trying the Asus TUF OC and seeing if I can get 370W.

I believe my Gigabyte has an almost-reference board, and I don't have dual BIOS... so I'm a bit sceptical about trying this.


----------



## VPII

Mucho said:


> Yes, it worked for me; I'm hitting 2070MHz more stably than before. The PL is the same at 109%. The BIOS of the non-OC version seems to be a little crappy.


The 109% doesn't seem to do much, as your power is still limited to 320W and you can see it losing clock speed the moment it reaches and passes 320W.


----------



## VPII

Vapochilled said:


> I have the Eagle OC from Gigabyte.
> 340W is the limit I've seen.
> Looking forward to trying the Asus TUF OC and seeing if I can get 370W.
> 
> I believe my Gigabyte has an almost-reference board, and I don't have dual BIOS... so I'm a bit sceptical about trying this.


Upload your Gigabyte Eagle OC BIOS if you can. I have tried 3 different BIOSes already, and my Palit one seems to be the best.


----------



## Mucho

VPII said:


> The 109% doesn't seem to do much, as your power is still limited to 320W and you can see it losing clock speed the moment it reaches and passes 320W.


In Afterburner I'm hitting 350W.


----------



## delreylover

Mucho said:


> In Afterburner I'm hitting 350W.


How? Are you hitting it stably, or does it hit 350W for a millisecond and then drop clocks and wattage to 310-320W territory? This weird drop happens for me on the stock Palit non-OC BIOS.


----------



## Mucho

delreylover said:


> How? Are you hitting it stably, or does it hit 350W for a millisecond and then drop clocks and wattage to 310-320W territory? This weird drop happens for me on the stock Palit non-OC BIOS.


I don't know. Afterburner is constantly showing PL, and the watts are jumping between 340 and 350; I would say 345W on average, with the PL slider at its max of 109%.
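For what it's worth, those readings line up with the slider math if the percentage simply scales the 320W reference TDP (my assumption about how Afterburner computes it):

```python
# Power-limit slider arithmetic, assuming the percentage scales the 320W default TDP
DEFAULT_TDP_W = 320

def slider_to_watts(percent: int, base: int = DEFAULT_TDP_W) -> float:
    """Convert an Afterburner power-limit slider percentage to watts."""
    return base * percent / 100

print(slider_to_watts(109))  # 348.8, matching the ~345-350W readings above
print(slider_to_watts(100))  # 320.0, the stock limit
```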


----------



## Alemancio

pewpewlazer said:


> I wouldn't mind spending $700-800 for a 20% performance boost, but I'm not about to spend even 7-8 minutes stalking the internet for the opportunity to do so.


Exactly this. Time is money, and I'm done stalking the internet, done wasting my time. I'd even pay $1k for the card, I don't mind, but the difficulty of getting one is so frustrating.


----------



## Orbmu2k

Pet_gz said:


> https://fil.email/02Fu498Q
> 
> Asus TUF 3080 Bios ( No OC Version).
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Please, Asus TUF 3080 *OC Version* Bios???


Thanks for that, can you upload the "Quiet BIOS" too pls? 


EDIT:
attached 3080 MSI Ventus OC


----------



## VPII

Mucho said:


> In Afterburner I'm hitting 350W.


Interestingly, when I have Hardware Monitor open during the Time Spy run it maxes out at 330W, and the graph in MSI Afterburner is capped at 320W. Is your graph capped at 350W?


----------



## Vapochilled

Could it be that we need to apply the same trick as on the 1080 Ti, where we would run nvidia-smi -pl 375 to set a new power limit?


----------



## delreylover

I think Nvidia intentionally gimped the RTX 3080 to make the RTX 3090 look like the real bad boy. If our 3080s had a decent power limit, they would really compete head-to-head with the 3090.


----------



## Pet_gz

Pet_gz said:


> https://fil.email/02Fu498Q
> 
> Asus TUF 3080 Bios ( No OC Version).
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Please, Asus TUF 3080 *OC Version* Bios???





zalo1196 said:


> hi every one this is my asus rtx 3080 tuf oc performance bios enjoy
> 
> 
> 
> 
> 
> 
> 
> 
> Download files - Filemail
> 
> 
> Click here to view and download these shared files from Filemail.com
> 
> 
> 
> 
> fil.email


Thanks!!! Works perfectly.


----------



## Vapochilled

I will update my Gigabyte Eagle OC later. but im curious to see if i can jump the current 340W limit...


----------



## delreylover

Sorry for going kind of off topic, but will flashing another BIOS to my GPU void its warranty? I had 3-4 crashes yesterday, and I don't want to risk my warranty in case the card is defective.

Also, is there any good BIOS for Palit GamingPro 3080?


----------



## Pet_gz

Orbmu2k said:


> Thanks for that, can you upload the "Quiet BIOS" too pls?


https://fil.email/z9ZxWGzW


TUF 3080 Quiet BIOS (*Warning*: GPU-Z shows the OC BIOS boost clock (1785MHz) after flashing from this BIOS to the OC version... but it keeps the same TDP as the non-OC version... I don't understand why).


----------



## ChaosBlades

delreylover said:


> Sorry for going kind of off topic, but will flashing another BIOS to my GPU void its warranty? I had 3-4 crashes yesterday, and I don't want to risk my warranty in case the card is defective.
> 
> Also, is there any good BIOS for Palit GamingPro 3080?


Of course it will. 

The only interchangeable BIOS would be if the same card has an OC version. So if your Palit GamingPro 3080 has an OC version with the same exact PCB and you have a non-OC version then you could try the OC BIOS to get a higher power target / higher default clocks. That is if it is stable. Sometimes there is also unlocked BIOS for LN2 overclocking that leak but that is never going to happen for a bottom of the barrel card like the Palit GamingPro. Technically all reference cards BIOS should be interchangeable but I wouldn't risk it unless you see a lot of people reporting success with your exact card with a particular BIOS.


----------



## delreylover

ChaosBlades said:


> Of course it will.
> 
> The only interchangeable BIOS would be if the same card has an OC version. So if your Palit GamingPro 3080 has an OC version with the same exact PCB and you have a non-OC version then you could try the OC BIOS to get a higher power target / higher default clocks. That is if it is stable. Sometimes there is also unlocked BIOS for LN2 overclocking that leak but that is never going to happen for a bottom of the barrel card like the Palit GamingPro. Technically all reference cards BIOS should be interchangeable but I wouldn't risk it unless you see a lot of people reporting success with your exact card with a particular BIOS.


Thanks, mate. I think I'll hold onto this BIOS a little longer. Maybe one day I'll find the courage to try every single goddamn BIOS and benchmark them all.


----------



## jexux

Vapochilled said:


> I will update my Gigabyte Eagle OC later. but im curious to see if i can jump the current 340W limit...



What BIOS are you going to test? Thanks.


----------



## Orbmu2k

Pet_gz said:


> https://fil.email/z9ZxWGzW
> 
> 
> TUF 3080 Quiet BIOS (*Warning*: GPU-Z shows the OC BIOS boost clock (1785MHz) after flashing from this BIOS to the OC version... but it keeps the same TDP as the non-OC version... I don't understand why).


Thanks! Looks all fine here


----------



## Mucho

My Palit is liquid cooled now. SOTR Bench. Still testing. Running Palit GamePro OC Bios on a Palit Non OC


----------



## Purple_Light

Orbmu2k said:


> Thanks! Looks all fine here


Nice ! Are you stable around 2ghz now with your ventus oc ?


----------



## Orbmu2k

Purple_Light said:


> Nice ! Are you stable around 2ghz now with your ventus oc ?


The power target has no effect at all; it's still limited to 320W even at 117%.

I tried this BIOS primarily because of instability problems with the stock Ventus BIOS.


----------



## Purple_Light

Ok thanks, 

This has to be configured somewhere... I hope it is not in the hardware itself.


----------



## shiokarai

Mucho said:


> View attachment 2460443
> 
> 
> My Palit is liquid cooled now. SOTR Bench. Still testing. Running Palit GamePro OC Bios on a Palit Non OC


Is this 2115Mhz stable or just max/fluctuating?


----------



## Mucho

shiokarai said:


> Is this 2115Mhz stable or just max/fluctuating?


I've only tested the SOTR bench so far. At one point it maxed out at 2080; later in the bench it maxed out at 2115 until the end. But I still need to test it in real-life gaming.


----------



## VPII

I am somewhat confused; sorry, but I keep learning every day.

When I check the Advanced tab in GPU-Z, it does not show the power limit, only percentages, as can be seen in the attached picture. The 350W power limit also does not seem to work, as my clocks start dropping at 320W.


----------



## Pet_gz

VPII said:


> I am somewhat confused; sorry, but I keep learning every day.
> 
> When I check the Advanced tab in GPU-Z, it does not show the power limit, only percentages, as can be seen in the attached picture. The 350W power limit also does not seem to work, as my clocks start dropping at 320W.


Click General and select Nvidia Bios


----------



## VPII

Pet_gz said:


> Click General and select Nvidia Bios


Oh my word, thank you. I am so silly. I knew this but have not used it in ages. Thank you.


----------



## keikei

Do email notifications actually work? I just requested a bunch.


----------



## delreylover

I still don't quite understand why so many people's cards are stuck around 320W and the power limit target doesn't help. That's so weird.


----------



## VPII

delreylover said:


> I still don't quite understand why so many people's cards are stuck around 320W and the power limit target doesn't help. That's so weird.


I would really like to understand this as well. My card has a max power limit of 350W, but even when I set it, clocks start dropping at 320W or above. An interesting place to see this is the Shadow of the Tomb Raider benchmark at 1080p. It has three scenes: the first and second draw a little more than 320W from time to time, but the last stays below 320W, and there the clocks on my card lock at 2085, 2115, or 2130MHz depending on the offset I'm running.


----------



## shALKE

It is very strange indeed. I have a similar issue: I flashed the TUF BIOS and it shows a 375W max power limit, but while benchmarking/gaming, 323.5W is the max.


----------



## Anth0789

keikei said:


> Do email notifications actually work? I just requested a bunch.


They actually do. Last night I got a notification that a Gigabyte was in stock at Newegg around 10:30 PM EDT, but of course it was out of stock within a minute.


----------



## keikei

Anth0789 said:


> They actually do. Last night I got a notification that a Gigabyte was in stock at Newegg around 10:30 PM EDT, but of course it was out of stock within a minute.


Thanks.

Update: I got one, and within 10 secs it was gone. LMAO.


----------



## Vapochilled

So, it's worthless to flash my Gigabyte Eagle OC 3080 with a TUF BIOS to try to go from 340W to 375W, because even the BIOS flash won't change the power limit?
And the power sliders still don't work? If I flash the TUF OC BIOS, won't the power slider work with Asus GPU Tweak? Has anyone tested this?

One other doubt I've already asked about:
do we need the same trick as with the Pascal series, where we would run nvidia-smi -pl xxx (xxx being the power limit target)? Because back then, changing only the BIOS would not change the TDP unless you did this. There was even a .bat file available in the Pascal owners' thread.
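For reference, the Pascal-era trick went through the driver rather than the BIOS. Whether Ampere honours it beyond the VBIOS maximum is exactly the open question here, but the commands themselves are simple (elevated prompt required; the 370 value is just an example target):

```shell
# Show the current enforced limit plus the min/default/max the VBIOS allows
nvidia-smi -q -d POWER

# Ask the driver for a higher limit; it is only accepted up to the
# "Max Power Limit" reported by the query above
nvidia-smi -pl 370
```

If the query already reports a max of 320W, no `-pl` value above that will be accepted, which would match what people are seeing.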


----------



## VPII

Vapochilled said:


> So, it's worthless to flash my Gigabyte Eagle OC 3080 with a TUF BIOS to try to go from 340W to 375W, because even the BIOS flash won't change the power limit?
> And the power sliders still don't work? If I flash the TUF OC BIOS, won't the power slider work with Asus GPU Tweak? Has anyone tested this?
> 
> One other doubt I've already asked about:
> do we need the same trick as with the Pascal series, where we would run nvidia-smi -pl xxx (xxx being the power limit target)? Because back then, changing only the BIOS would not change the TDP unless you did this. There was even a .bat file available in the Pascal owners' thread.


In all honesty, I've flashed my Palit GamingPro OC with the following BIOSes, with this behaviour:

EVGA FTW3 Ultra - worked, with the power limit at 400W stock, but when I ran Time Spy the power usage would immediately go to 390 up to 400+ watts; and running the Tomb Raider benchmark at 1080p, where my clocks would normally sit between 2055 and 2085MHz, the power drawn was 380 to 390W with clocks falling from the low 1900s into the 1800MHz range.

Asus TUF - behaved almost exactly like the Palit BIOS, with clocks dropping when reaching 320W or higher, but performance was a little worse.

Asus TUF OC - behaved almost exactly like the Palit BIOS, with clocks dropping when reaching 320W or higher, but performance was a little worse.

So right now, I believe even the +9% on power to take it to 350W is not really working, as you lose clocks when you reach 320W. I tested this with 3DMark Time Spy (which usually draws around 320 to 335W) by increasing my clock offset by +165MHz, giving me 2220MHz max clocks; I was able to run the benchmark without an issue, and clocks would still drop into the 1900MHz range but mostly stayed in the 2000MHz range. This gave me an average clock speed of 1986MHz.



https://www.3dmark.com/spy/14243859



Now, there is no way to know whether these clocks would be stable if you had the power to keep them there.


----------



## Vapochilled

VPII said:


> In all honesty, I've flashed my Palit GamingPro OC with the following BIOSes, with this behaviour:
> 
> EVGA FTW3 Ultra - worked, with the power limit at 400W stock, but when I ran Time Spy the power usage would immediately go to 390 up to 400+ watts; and running the Tomb Raider benchmark at 1080p, where my clocks would normally sit between 2055 and 2085MHz, the power drawn was 380 to 390W with clocks falling from the low 1900s into the 1800MHz range.
> 
> Asus TUF - behaved almost exactly like the Palit BIOS, with clocks dropping when reaching 320W or higher, but performance was a little worse.
> 
> Asus TUF OC - behaved almost exactly like the Palit BIOS, with clocks dropping when reaching 320W or higher, but performance was a little worse.
> 
> So right now, I believe even the +9% on power to take it to 350W is not really working, as you lose clocks when you reach 320W. I tested this with 3DMark Time Spy (which usually draws around 320 to 335W) by increasing my clock offset by +165MHz, giving me 2220MHz max clocks; I was able to run the benchmark without an issue, and clocks would still drop into the 1900MHz range but mostly stayed in the 2000MHz range. This gave me an average clock speed of 1986MHz.
> 
> 
> 
> https://www.3dmark.com/spy/14243859
> 
> 
> 
> Now, there is no way to know whether these clocks would be stable if you had the power to keep them there.


I don't have the balls to flash an FTW3 BIOS (3x 8-pin power) onto a 2x 8-pin card like my Gigabyte Eagle OC.
I don't have dual BIOS... so unless someone tests it first, I'm not going to try, hahaha.

Lucky you that a 3x-power BIOS works on a 2x-power PCB... maybe that's why you're getting 400W.


----------



## Vapochilled

BTW, the Gigabyte Gaming OC is 370W!!!!
That one I would take the risk on.
Anyone with a Gigabyte Gaming OC who could share the BIOS?


----------



## VPII

Vapochilled said:


> I don't have the balls to flash an FTW3 BIOS (3x 8-pin power) onto a 2x 8-pin card like my Gigabyte Eagle OC.
> I don't have dual BIOS... so unless someone tests it first, I'm not going to try, hahaha.
> 
> Lucky you that a 3x-power BIOS works on a 2x-power PCB... maybe that's why you're getting 400W.


Maybe, but at least I am back with my baby's original bios and she's happy. Ha ha ha


----------



## VPII

Vapochilled said:


> BTW, the Gigabyte Gaming OC is 370W!!!!
> That one I would take the risk on.
> Anyone with a Gigabyte Gaming OC who could share the BIOS?


One member mentioned earlier having it and I asked to share but no reply.


----------



## Vapochilled

VPII said:


> One member mentioned earlier having it and I asked to share but no reply.


Do you recall who it was ? I could pm


----------



## VPII

Vapochilled said:


> Do you recall who it was ? I could pm


It was a page or so back, cannot recall


----------



## Vapochilled

Rbk_3 said:


> Was able to get the Gigabyte Gaming OC. Couple things.
> I got 4k 120 to work on my LG C9 but I can’t get Gsync to work with it. The screen goes black when I launch a game.
> 
> Also getting some frametime issues in Warzone that I wasn’t getting with my 1080ti. Nothing major, but some little spikes from 7ms to 13ms that is enough to annoy me. Regular MP seems fine. I did a fresh windows install, updated my bios and even got some new Ram, removed my OCs etc but it is still an issue.
> 
> Also, I can’t move the power slider past 100% in Afterburner. My old card I could go to 117%


Could you post your Gigabyte Gaming OC Bios?


----------



## Vapochilled

ElectroManiac said:


> Gigabyte


Did you get the Gaming OC already? Could you post your BIOS here using the new nvflash for Ampere?


----------



## Mucho

Borderlands 3 Bench. I think with a PL of 370W I would get a stable 2100MHz OC.


----------



## doom26464

keikei said:


> Thanks.
> 
> update: i got one, and within 10 secs it was gone. LMAO.


They're useless.

I got a notification for the Asus TUF 3080 in stock at Newegg as I was sitting down for supper the other day. By the time I got to the Newegg page, they were already out of stock.

Like I posted in another thread, something at the factories in China is delaying these cards; something in the pipeline is holding up stock. There should be no excuse for this half-assed launch of maybe a few thousand units worldwide when a single AIB factory in China is capable of a million units per month.


----------



## Riadon

http://www.filedropper.com/gigabytegamingoc



Here you go, hard limit is 370w and soft limit seems to be around 350-355w (the point at which the clocks and voltage start dropping).


----------



## keikei

doom26464 said:


> They're useless.
> 
> I got a notification for the Asus TUF 3080 in stock at Newegg as I was sitting down for supper the other day. By the time I got to the Newegg page, they were already out of stock.
> 
> Like I posted in another thread, something at the factories in China is delaying these cards; something in the pipeline is holding up stock. There should be no excuse for this half-assed launch of maybe a few thousand units worldwide when a single AIB factory in China is capable of a million units per month.


The supply chain hasn't changed since I've been in the game, it seems. All it took was another surge in demand (for whatever reason) and we get another 'paper launch'. Every retailer is getting drip-fed cards. Gamers should have cards by the second half of 2021.


----------



## Mucho

@VPII Are you gonna test the Gigabyte Gaming OC bios on the Palit?


----------



## doom26464

keikei said:


> The supply chain hasn't changed since I've been in the game, it seems. All it took was another surge in demand (for whatever reason) and we get another 'paper launch'. Every retailer is getting drip-fed cards. Gamers should have cards by the second half of 2021.


My guess is there aren't enough GDDR6X memory modules, or yields on the GA102 die are still pretty low. Nvidia probably got what little they could from August production and part of September and decided to rush the launch while waiting for yields to improve and supply to increase.


----------



## VPII

Mucho said:


> @VPII Are you gonna test the Gigabyte Gaming OC bios on the Palit?


I will most certainly if I can get it.


----------



## Riadon

VPII said:


> I will most certainly if I can get it.





http://www.filedropper.com/gigabytegamingoc


----------



## delreylover

Mucho said:


> View attachment 2460477
> View attachment 2460478
> 
> 
> Borderlands 3 Bench. I think with a PL of 370W I would get a stable 2100MHz OC.


This is very impressive. At 100% load my card can only get around 1780-1800mhz and it usually drops to 0.900v... damn, is it silicon lottery or bad bios?


----------



## VPII

Riadon said:


> http://www.filedropper.com/gigabytegamingoc


Tested, and I am somewhat impressed. This card's base clocks are 60MHz higher than my Palit's, as in 2055 + 60MHz, so 2115MHz stock. When I ran it I saw that 320W is still more or less the cut-off where clocks start dropping. I increased the clocks by 105MHz and ran Shadow of the Tomb Raider, and it passed at 2160MHz almost the entire time during the last scene. The lowest the clocks went was 2025, but just briefly. Still, it seems good. I will try Time Spy tomorrow to see what it does.


----------



## delreylover

VPII said:


> Tested, and I am somewhat impressed. This card's base clocks are 60MHz higher than my Palit's, as in 2055 + 60MHz, so 2115MHz stock. When I ran it I saw that 320W is still more or less the cut-off where clocks start dropping. I increased the clocks by 105MHz and ran Shadow of the Tomb Raider, and it passed at 2160MHz almost the entire time during the last scene. The lowest the clocks went was 2025, but just briefly. Still, it seems good. I will try Time Spy tomorrow to see what it does.


So it still dropped clocks at 320W? Damn, I can't understand why it happens...


----------



## VPII

delreylover said:


> So it still dropped clocks at 320W? Damn, I can't understand why it happens...


Yes, it does... I think it is built into the hardware, if you ask me.


----------



## Mucho

VPII said:


> Yes, it does... I think it is built into the hardware, if you ask me.


I don't think so. My Palit is hitting 350 the whole time. In some parts of the bench it drops down to 330W, but just for a sec or two.
So the Gaming OC BIOS is working on the Palit?

@delreylover The BIOS of the GamingPro non-OC is crap; use the GamingPro OC BIOS VPII uploaded here.


----------



## VPII

Mucho said:


> I don't think so. My Palit is hitting 350 the whole time. In some parts of the bench it drops down to 330W, but just for a sec or two.
> So the Gaming OC BIOS is working on the Palit?
> 
> @delreylover The BIOS of the GamingPro non-OC is crap; use the GamingPro OC BIOS VPII uploaded here.


Well, I seem to be not that fortunate. Do me a favour: run Time Spy and link the result. I'd like to see your average clocks, as Time Spy pulls 320 to 335W, so your clocks should remain constant.

Sent from my SM-G960F using Tapatalk


----------



## Mucho

VPII said:


> Well, I seem to be not that fortunate. Do me a favour: run Time Spy and link the result. I'd like to see your average clocks, as Time Spy pulls 320 to 335W, so your clocks should remain constant.
> 
> Sent from my SM-G960F using Tapatalk


Already flashed the Gaming OC onto the Palit 😂

@VPII Please test the middle DP port; mine isn't working. The left and right ones are working, and HDMI is working too.


----------



## VPII

Mucho said:


> Already flashed the Gaming OC onto the Palit
> 
> @VPII Please test the middle DP port; mine isn't working. The left and right ones are working, and HDMI is working too.


Sorry, I already flashed back. I'll probably stick with the Palit. I'm just not sure how some people see 350W used when the highest I saw was 335W.

Sent from my SM-G960F using Tapatalk


----------



## Mucho

The PL slider only shows me 100%; I can't max it out. Same problem?


----------



## VPII

Mucho said:


> The PL slider only shows me 100%; I can't max it out. Same problem?


With the GB Gaming OC BIOS, that's how it is.

Sent from my SM-G960F using Tapatalk


----------



## Reinhardovich773

Greetings to both @Mucho and @VPII! I just wanted to ask whether you guys have tried flashing the new Palit RTX 3080 GamingPro OC VBIOS just released today:








::Palit Products - GeForce RTX™ 3080 GamingPro OC :: (www.palit.com)




Did you guys test your cards with this new VBIOS, or are you just using the one that came with the card? Thanks in advance for any info; the Palit 3080 GamingPro OC is the card I ordered, it's supposed to arrive early next week, and I wanted to gather as much info as possible about it, haha.

EDIT: BTW, the old Palit VBIOS version is *94.02.26.50.90* and the newest one is *94.02.26.08.8A*. Sadly, there are no release notes on the Palit website...


----------



## VPII

Reinhardovich773 said:


> Greetings to both @Mucho and @VPII! I just wanted to ask whether you guys tried flashing in the new Palit RTX 3080 GamingPro OC just released today:
> 
> 
> 
> 
> 
> 
> 
> 
> ::Palit Products - GeForce RTX™ 3080 GamingPro OC :: (www.palit.com)
> 
> 
> 
> 
> Did you guys test your cards with this new VBIOS or are you just using the one that came with the card? Thanks in advance for any potential info as the Palit 3080 GamingPro OC is the card that i ordered and is supposed to arrive to me early next week hopefully, and i wanted to gather as much info as possible about the card haha.
> EDIT: old Palit VBIOS number is: 94.02.26.50.90 and the newest one is: 94.02.26.08.8A. Sadly there are no release notes in the Palit website...


Nope but thank you so much for sharing. Will try now.

Sent from my SM-G960F using Tapatalk


----------



## Reinhardovich773

VPII said:


> Nope but thank you so much for sharing. Will try now.
> 
> Sent from my SM-G960F using Tapatalk


You're most welcome! Please do try to keep us updated if you notice any changes whatsoever in the card's behaviour (fan and voltage/frequency curves, thermals and acoustics) if possible. Thanks in advance!


----------



## shALKE

nvflash64 Version 5.665.0





nvflash64_Version_5.665.zip (drive.google.com)




I extracted it from the Palit bios update. The bios file is password protected.


----------



## delreylover

Reinhardovich773 said:


> Greetings to both @Mucho and @VPII! I just wanted to ask whether you guys tried flashing in the new Palit RTX 3080 GamingPro OC just released today:
> 
> 
> 
> 
> 
> 
> 
> 
> ::Palit Products - GeForce RTX™ 3080 GamingPro OC :: (www.palit.com)
> 
> 
> 
> 
> Did you guys test your cards with this new VBIOS or are you just using the one that came with the card? Thanks in advance for any potential info as the Palit 3080 GamingPro OC is the card that i ordered and is supposed to arrive to me early next week hopefully, and i wanted to gather as much info as possible about the card haha.
> 
> EDIT: BTW the old Palit VBIOS number is: *94.02.26.50.90* and the newest one is: *94.02.26.08.8A*. Sadly there are no release notes in the Palit website...


Wow, thank you, sir! I wouldn't have noticed it myself if it weren't for you! I emailed Palit about the power issues and how they updated their 2080 Ti with higher power limits; they haven't replied yet, but hopefully they'll acknowledge the issues.


----------



## VPII

Reinhardovich773 said:


> Greetings to both @Mucho and @VPII! I just wanted to ask whether you guys tried flashing in the new Palit RTX 3080 GamingPro OC just released today:
> 
> ::Palit Products - GeForce RTX™ 3080 GamingPro OC :: (www.palit.com)
> 
> Did you guys test your cards with this new VBIOS or are you just using the one that came with the card? Thanks in advance for any potential info as the Palit 3080 GamingPro OC is the card that i ordered and is supposed to arrive to me early next week hopefully, and i wanted to gather as much info as possible about the card haha.
> 
> EDIT: BTW the old Palit VBIOS number is: *94.02.26.50.90* and the newest one is: *94.02.26.08.8A*. Sadly there are no release notes in the Palit website...


Hi there, thanks. Same result though: clocks still drop at 320W or above, but I get the same results, so it is fine.


----------



## delreylover

VPII said:


> Hi there, thanks. Same result though: clocks still drop at 320W or above, but I get the same results, so it is fine.


Power slider still doesn't work? Damn.


----------



## Mucho

delreylover said:


> Wow, thank you, sir! I wouldn't have noticed it myself if it weren't for you! I emailed Palit about the power issues and how they updated their 2080 Ti with higher power limits; they haven't replied yet, but hopefully they acknowledge the issue.


Well, the update is an .exe, so I think flashing the Palit OC BIOS from VPII first is the way to go. After that, the .exe should update it to the new BIOS.


----------



## delreylover

Mucho said:


> Well the Update is a .exe. So I think, flashing the Palit OC Bios from VPII first is the way to go. After that the .exe should update to the new bios.


Alright, I see. Thanks!


----------



## Reinhardovich773

delreylover said:


> Wow, thank you, sir! I wouldn't have noticed it myself if it weren't for you! I emailed Palit about the power issues and how they updated their 2080 Ti with higher power limits; they haven't replied yet, but hopefully they acknowledge the issue.


You're most welcome, sir! I did not find out about the new VBIOS through Palit but rather through a post on Nvidia's official subreddit. I think Palit does a poor job when it comes to communication with their customers haha!
About flashing the VBIOS on your card: it should cause absolutely no problems, as both cards use the exact same PCB, with the only possible difference being in the binning of the GPU chips themselves. But in the end we're all playing the Silicon Lottery™ game, so you could possibly end up with a card that overclocks even better than the OC variant 😁 and what better way to find out than flashing a more potent VBIOS onto the card.
Anyway, I wish you good luck with the flashing process should you wish to undertake it. And please do try to keep us updated if you notice something different regarding the card's behaviour. Cheers!


----------



## Reinhardovich773

VPII said:


> Hi there, thanks. Same result though: clocks still drop at 320W or above, but I get the same results, so it is fine.


You're most welcome! The power slider matter is a bit disappointing though, if it's confirmed to be an issue with the card. I'm thinking it could be an insufficient-voltage issue: if you set a lower voltage in the voltage/frequency curve in a program like MSI Afterburner, you can effectively reduce the card's power consumption at the cost of lower boost clocks. On the other hand, if you raise voltages, you should be able to make the card more stable at higher boost clocks, though it will also consume more power. Maybe, for example, you need a core voltage above, say, 1.061V in order to fully max out the 350W TBP? I still don't have the card so I can't say for sure; I'm just speculating based on previous experience with cards like the GTX 970 and RTX 2070 that I previously owned.


----------



## Mucho

So I flashed the Palit OC BIOS from VPII and turned the card into the OC version. After that I started the .exe and updated the BIOS. Going to test it now a bit. Strangely, even with the Gigabyte OC BIOS, the card was maxing out at 350W; it's the same as your cards maxing out at 320W. Maybe all the tools aren't working properly yet.
With the Gigabyte OC BIOS I got 2145MHz in SotTR, never falling under 2080MHz.


----------



## Reinhardovich773

VPII said:


> Hi there, thanks. Same result though: clocks still drop at 320W or above, but I get the same results, so it is fine.


Oh ok i see. Thanks so much for your valuable feedback. Much appreciated!


----------



## Reinhardovich773

Mucho said:


> So I flashed the Palit OC BIOS from VPII and turned the card into the OC version. After that I started the .exe and updated the BIOS. Going to test it now a bit. Strangely, even with the Gigabyte OC BIOS, the card was maxing out at 350W; it's the same as your cards maxing out at 320W. Maybe all the tools aren't working properly yet.
> With the Gigabyte OC BIOS I got 2145MHz in SotTR, never falling under 2080MHz.


Those are some very impressive numbers! BTW, did you test SotTR with RT effects ON? Because you have to keep in mind that games with RT effects are *a lot* less stable and more finicky when it comes to overclocking. Your OC could be stable in 99%+ of the games you play, but the moment you play a game with heavy RT effects (like Control), you'd be lucky to keep the game from crashing within 15 min of gameplay. And I'm speaking from experience here with my previous Gigabyte RTX 2070 Gaming OC.


----------



## Somandarin

A bit out of the loop w.r.t GPUs lately. Is nVidia releasing a super series of these cards as well? If so, when?


----------



## keikei

Somandarin said:


> A bit out of the loop w.r.t GPUs lately. Is nVidia releasing a super series of these cards as well? If so, when?











GALAX's internal roadmap confirms GeForce RTX 3080 20GB, GeForce RTX 3060 (videocardz.com)


----------



## delreylover

Is it possible to downgrade BIOS? I want to try the Palit OC BIOS that @VPII posted earlier. Though it's now an old version since I updated to the one Palit officially posted.


----------



## Mucho

Yes, via Nvflash downgrading should be possible
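For reference, the usual nvflash sequence looks roughly like this (just a sketch: the .rom file names are placeholders, and you should double-check against nvflash's own help output before flashing anything):

```bat
rem Run from an elevated command prompt in the folder holding nvflash64.exe.
rem "palit_oc_old.rom" is a placeholder for the BIOS file you want to go back to.

nvflash64 --save backup.rom       rem back up the BIOS currently on the card
nvflash64 --protectoff            rem disable the EEPROM write protection
nvflash64 -6 palit_oc_old.rom     rem flash the old BIOS (-6 overrides ID mismatch prompts)
```

Reboot afterwards and verify the BIOS version in GPU-Z.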


----------



## eeroo94

I tried the TUF BIOS on my Ventus, but same story: it hits the power limit at 320W.


----------



## 6u4rdi4n

Arrived today!


----------



## Vapochilled

Same here. Flashed the Gigabyte Gaming OC 370W BIOS on my Eagle OC 340W.
I still don't see values above 340W...


----------



## i core

Riadon said:


> http://www.filedropper.com/gigabytegamingoc


thanks for sharing


----------



## Mucho

Vapochilled said:


> Same here. Flashed gigabyte gaming oc 370w on my eagle Oc 340w.
> I still don't see values above 340w....


But Freq is going up?


----------



## Vapochilled

Mucho said:


> But Freq is going up?


I had a 9530 graphics score in 3DMark Time Spy Extreme. That's the same as a 3090 lolololol

Curious thing is...
I did nvidia-smi.exe -pl 370
And saw 365W in 3DMark
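For anyone wanting to try the same, the commands are roughly as follows (run from an elevated prompt; note that nvidia-smi can only raise the limit up to the maximum the BIOS itself reports):

```bat
rem Query the default, current, and max enforceable power limits
nvidia-smi -q -d POWER

rem Raise the software power limit to 370W (needs admin rights; capped
rem at the "Max Power Limit" reported by the query above)
nvidia-smi -pl 370
```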


----------



## Riadon

Vapochilled said:


> I had a 9530 graphics score in 3DMark Time Spy Extreme. That's the same as a 3090 lolololol
> 
> Curious thing is...
> I did nvidia-smi.exe -pl 370
> And saw 365W in 3DMark


Think this might work with a triple 8-pin connector BIOS as well? i.e. the FTW3 BIOS on my Gaming OC, which I know doesn't work properly without this trick.


----------



## Vapochilled

Riadon said:


> Think this might work with triple 8-pin connector bios as well? i.e FTW3 bios on my gaming oc, which I know doesn't work properly without this trick.


Got lost in your comment. You have a Gigabyte Gaming OC that is 2x 8-pin and you flashed the 3x 8-pin EVGA BIOS?
1. Did that craziness actually work?
2. Did your TDP only change after this nvidia-smi trick?


----------



## Riadon

Vapochilled said:


> Got lost in your comment. You have a Gigabyte Gaming OC that is 2x 8-pin and you flashed the 3x 8-pin EVGA BIOS?
> 1. Did that craziness actually work?
> 2. Did your TDP only change after this nvidia-smi trick?


No, I didn't flash the BIOS. Someone else (post #578) flashed the FTW3 BIOS to a Palit card and claimed it wasn't working properly: non-power-limited benchmarks, which previously pulled wattage in the low 300s, immediately showed 400W with the new BIOS, but without a corresponding increase in clock speed, meaning the power limit wasn't actually being raised. Curious to know if the trick you posted would remedy that problem and actually lead to an increase in the power limit.


----------



## Mucho

Open a new .txt file, add this, and rename it to power.bat:

@echo off
rem Switch to the NVSMI folder and print the GPU power readings
cd /d "C:\Program Files\NVIDIA Corporation\NVSMI"
nvidia-smi.exe -q -d POWER
pause

Edit:
Flashed the 370W BIOS. The bat file shows me that 370W is selected. But after benching, the bat file tells me the power draw was 352W. Why am I not able to get 370W out of the card?


----------



## Riadon

Mucho said:


> Open a new .txt file, add this, and rename it to power.bat:
> 
> @echo off
> rem Switch to the NVSMI folder and print the GPU power readings
> cd /d "C:\Program Files\NVIDIA Corporation\NVSMI"
> nvidia-smi.exe -q -d POWER
> pause
> 
> Edit:
> Flashed the 370W BIOS. The bat file shows me that 370W is selected. But after benching, the bat file tells me the power draw was 352W. Why am I not able to get 370W out of the card?


My Gaming OC with the stock BIOS (370W) doesn't ever hit 370W either. The cards seem to be built to start backing down on voltage when they come within about 5% of the power limit, because I max out at around 352W-355W before I hit the "soft" limit and start losing clock speed.


----------



## Mucho

Mine is between 345W and 352W. Would be nice to know if other people are able to hit 370W?


----------



## ELCID777

Hello, I unfortunately have the Zotac Trinity 3080 and it has a power limit of 105%. Has anyone tried flashing a different BIOS that's compatible with this card? I wanted to try the FE BIOS, but I'm not sure it will work.


----------



## VPII

ELCID777 said:


> Hello, I unfortunately have the Zotac Trinity 3080 and it has a power limit of 105. Has anyone tried flashing to a different bios that's compatible with this card? I wanted to try the FE bios, but not sure it will work.


I tried the FE bios on my Palit card and it did not work. So not sure.


----------



## edu616

Greetings,

I'm new here. Is any of this safe for the FE 3080? I have always liked doing OCs using power tables on AMD cards. Is this process similar, and will it increase the overall power limit so I can get higher clocks? Thanks.


----------



## cstkl1

Mucho said:


> Mine is between 345w - 352w. Would be nice to know, if other people are able to hit 370w?


TUF: 375W on spikes, constant 34x-35xW.


----------



## VPII

You guys are lucky. The highest power draw I have seen with the Palit GamingPro OC has been in the 335W range, and clocks drop the moment it gets close to 320W, even with the added 9% power limit, which basically means 350W or thereabouts.


----------



## VPII

Okay, I tested in the SotTR bench now with the power limit left at 100% to see where the power peaked, at full detail at 1440p, and it peaked at 327W, although it was more like 322W or so, as the 327W was a spike at the end. Then I increased the power limit by the 9%, which effectively means 350W, ran the bench again, and it peaked at 333W or thereabouts. Will do the same now with Time Spy, as it draws more power.


----------



## cstkl1

Hmm, nobody has read the power limit entry in nvflash64...


----------



## cstkl1

I think you need to flash the InfoROM to get the power limit like the TUF..

my guess..

cause the PP limit etc. seems to be stored in the InfoROM..

again, just guessing...


----------



## Talon2016

cstkl1 said:


> i think you need to flash the inforom to get the powerlimit like tuf..
> 
> my guess..
> 
> cause the pp limit etc seems to stored in the inforom..
> 
> again just guessing...


Yes, I've been wondering what that is when I flash. How do we flash the InfoROM?


----------



## VPII

Talon2016 said:


> Yes I've been wondering what that is when I flash, how do we flash the inforom?


Add --flashinforom to the command line when flashing, as per nvflash's help... Tried it and it did nothing; will try again.
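As a sketch, the flash line with that option added would look something like this ("tuf_oc.rom" is a placeholder name, and whether flashing the InfoROM actually carries the power-limit entries over is unconfirmed):

```bat
rem Hedged sketch -- run from an elevated prompt in the nvflash folder
nvflash64 --protectoff
nvflash64 -6 --flashinforom tuf_oc.rom
```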


----------



## Talon2016

VPII said:


> --flashinforom in the line when flashing as per nvflash... Tried it and did nothing, will try again.


Thanks for trying. I've tried all of the vBIOS except for the FE and none of them worked. I think the Asus OC and gigabyte gave me about 10 more watts but really not worth it IMO. I did manage to break 12K on Port Royal though. They really did something to prevent vBIOS cross flashing it seems. Almost makes you want to go AMD for those that like to tweak. Nvidia really taking all of the fun out of overclocking IMO. Might be time to move to AMD if they are able to compete.


----------



## VPII

Talon2016 said:


> Thanks for trying. I've tried all of the vBIOS except for the FE and none of them worked. I think the Asus OC and gigabyte gave me about 10 more watts but really not worth it IMO. I did manage to break 12K on Port Royal though. They really did something to prevent vBIOS cross flashing it seems. Almost makes you want to go AMD for those that like to tweak. Nvidia really taking all of the fun out of overclocking IMO. Might be time to move to AMD if they are able to compete.


I hear you, it is somewhat depressing. I broke 12K in Port Royal on the 24th of September, but only just. Still need to do some fine-tuning to get it working with my overclocks. I push my GPU up to 2220MHz max in Time Spy, because the power limit drops the clocks instantly, so there is no way it would hold those clocks, and my average clock is about 1981MHz during the run.


----------



## i core

Mucho said:


> Mine is between 345w - 352w. Would be nice to know, if other people are able to hit 370w?


Mine hits around 364W max in GTA V; on average it's about 356W. I didn't see it hit the 370W ceiling.


----------



## Pet_gz

FurMark v1.22.0 for RTX 3xxx

Peaks (OC +150/170MHz): 2190MHz, 1.100V, 367.6W.

Best run, 4K preset: [screenshot]


----------



## VPII

Pet_gz said:


> FurMark v1.22.0 for RTX 3xxx
> 
> Peaks: 2190MHz, 1.100V, 367.6W.
> 
> Best run, 4K preset: [screenshot]


How do you run FurMark without the GPU going way down below even base clocks? Mine just drops to 1700 or 1600MHz immediately.


----------



## Pet_gz

VPII said:


> How do you run FurMark without the GPU going way down below even base clocks? Mine just drops to 1700 or 1600MHz immediately.


Running the FurMark bench, the average clock is 1540-1560MHz normally (OC +150/170MHz).


----------



## VPII

Pet_gz said:


> Running furmark bench, average clock 1540-1560mhz normally


Okay, so what do you run to see 2190MHz clocks?


----------



## Pet_gz

VPII said:


> Okay, so what do you run to see 2190mhz clocks.


The FurMark stress test shows the boost clock; the bench shows the core clock.


----------



## Vapochilled

Pet_gz said:


> Furmark Stress test shows boost clock, bench shows core clock.


What graphics score do you get in 3DMark Time Spy Extreme?
Funny thing is that my result went up from 9100 to 9500 by changing BIOS from Eagle OC to Gaming OC and applying the same MSI AB voltage/frequency curve.


----------



## Pet_gz

Vapochilled said:


> What Graphic result do you get on 3dmark Timespy Extreme?
> Funny thing is that my results came up from 9100 to 9500 by changing bios from Eagle OC to Gaming OC and applying the same MSI AB voltage / speed curve.


Only the basic edition of Time Spy; best run (previous drivers and TUF non-OC BIOS): [screenshot]

With new drivers, OC BIOS and highest OC/voltage - worst score.... 😅😂🤣


----------



## VPII

Pet_gz said:


> Furmark Stress test shows boost clock, bench shows core clock.


Sorry, maybe I am missing something here. In FurMark it would show my clocks dropping into the 1700, sometimes 1600MHz range. Did not check in the MSI AB hardware monitor, but it was clear that the clocks dropped like crazy.


----------



## Vapochilled

VPII said:


> Sorry, maybe I am missing something here. In Furmark it would show my clocks dropping into 1700 sometimes 1600mhz range. Did not check in MSI AB HW monitor but felt it was clear that the clocks dropped like crazy.



Undervolt with the MSI AB curve.
You can do 1880MHz at 0.88V easily, or 1900MHz at 0.9V. That will lower the power usage.


----------



## delreylover

I wonder if we ever will be able to flash BIOS with working power limits.


----------



## VPII

Vapochilled said:


> Undervolt with MSI AB curve.
> You can do 1880 with 0.88 easy or 1900 with 0.9v. That will lower the power usage


But that is not what I want. I decided to run Heaven for an hour and, great: my clocks were way above 2000MHz 90% of the time, and it was stable.


----------



## delreylover

VPII said:


> But that is not what I want. I decided to run Heaven for an hour and, great: my clocks were way above 2000MHz 90% of the time, and it was stable.


Did you overclock it? Because my Palit non-OC maxes out around 1935-1960MHz. Never over 2000MHz without manual overclocking.


----------



## VPII

delreylover said:


> Did you overclock it? Because my Palit non-OC maxes out around 1935-1960MHz. Never over 2000MHz without manual overclocking.


Hey, my Palit is overclocked out of the box, and even at stock settings it would drop right into the 1500MHz range. It is a load of junk, to say the least.


----------



## shALKE

Isn't the boost clock 1710MHz, and if it's OC out of the box, usually 60-75MHz more? Why would it drop below the boost clock if you don't have thermal issues?


----------



## delreylover

shALKE said:


> Isn't the boost clock 1710MHz, and if it's OC out of the box, usually 60-75MHz more? Why would it drop below the boost clock if you don't have thermal issues?


Mine also drops to around 1650MHz. No thermal issues, the card is around 60°C. I think it's because of the power limit; it's too low.


----------



## VoRtAn

delreylover said:


> Mine also drops to around 1650mhz. No thermal issues, card is around 60c. I think its because of power limits. It's too low.


Can the people whose boost is under NVIDIA specs please take some screenshots with the voltage curve from Afterburner + the GPU-Z sensor tab + the Afterburner monitor?
Something like this, plus the voltage curve in FurMark, for example:

[Screenshot "3080-456-55", hosted on ImgBB (imgbb.com)]


----------



## delreylover

VoRtAn said:


> Can the people whose boost is under NVIDIA specs please take some screenshots with the voltage curve from Afterburner + the GPU-Z sensor tab + the Afterburner monitor?
> Something like this, plus the voltage curve in FurMark, for example:
> 
> [Screenshot "3080-456-55", hosted on ImgBB (imgbb.com)]


I will, but the thing is, in your screenshot the GPU usage is lower than 99%. In that case, my GPU also exceeds the Nvidia specs. When I push the card to 99%, for example 4K ultra RDR2, the card hits 99% usage and throttles back because of the power limit.


----------



## VoRtAn

delreylover said:


> I will, but the thing is, in your screenshot the GPU usage is lower than 99%. In that case, my GPU also exceeds the Nvidia specs. When I push the card to 99%, for example 4K ultra RDR2, the card hits 99% usage and throttles back because of the power limit.


That's just an example for testing the low clocks that you guys are reporting; that screenshot was from testing whether the card was crashing with high boost on the new driver.


----------



## delreylover

VoRtAn said:


> Can the people whose boost is under NVIDIA specs please take some screenshots with the voltage curve from Afterburner + the GPU-Z sensor tab + the Afterburner monitor?
> Something like this, plus the voltage curve in FurMark, for example:
> 
> [Screenshot "3080-456-55", hosted on ImgBB (imgbb.com)]


Hello,
I did it in RDR2 because it's the game I experience the most problems with boost clocks in.
Btw, this is with the stock Palit GamingPro non-OC BIOS, updated with the one on the Palit website.

Here, this is with stock settings: default fan curve, no OC, 100% power limit (320W).

[Screenshot "Screenshot-4", hosted on ImgBB (ibb.co)]

Another one with the same scenario (default everything):

[Screenshot "Screenshot-5", hosted on ImgBB (ibb.co)]

This one is without any overclock and with the default fan curve, but with the 109% power limit applied (350W):

[Screenshot "109", hosted on ImgBB (ibb.co)]

Sorry, I don't know how to embed pictures, so I had to use ibb.co.


----------



## VoRtAn

What resolution do you use?
Your power limit is hitting hard and downclocking the card.
The GPU voltage is always well below 1.000V; was it always like this?
Can you crank up the voltage in the Afterburner settings?

Thanks.


----------



## Mucho

A user from a German forum tried both the Gaming OC and the EVGA FTW3 BIOS on his MSI Ventus 3X OC. With the EVGA BIOS he is hitting 400W with no gain in frequency. With the Gaming OC BIOS he is running into the PL at 320W like most users here. I don't understand why my card hits 350W with the same card and BIOS, but other users seem to have problems hitting anything beyond 320W?


----------



## delreylover

Sorry, btw I forgot to add the voltage curve.
Here is the voltage curve:

[Screenshot "Screenshot-6", hosted on ImgBB (ibb.co)]







VoRtAn said:


> What resolution do you use?


1080p monitor with in-game scaling set to 2x - so it's rendering at 4k



VoRtAn said:


> The GPU voltage is always well below 1.000V; was it always like this?


Unfortunately, yes. When under 90-99% load, it's always below 1.000V.




VoRtAn said:


> Can you crank up the voltage in the Afterburner settings?


I have tried it before, but I think it made zero difference. I will try again and post if anything changes.

Thanks.


----------



## delreylover

Mucho said:


> A user from a German forum tried both the Gaming OC and the EVGA FTW3 BIOS on his MSI Ventus 3X OC. With the EVGA BIOS he is hitting 400W with no gain in frequency. With the Gaming OC BIOS he is running into the PL at 320W like most users here. I don't understand why my card hits 350W with the same card and BIOS, but other users seem to have problems hitting anything beyond 320W?


Yes. My card barely goes over 320W, and when it does, the frequencies and voltages drop a lot, and it never exceeds 327-330W. Palit GamingPro non-OC with 109% PL.
And maybe it could be false reporting of power usage? I mean the 400W one.


----------



## Shadowdane

delreylover said:


> Hello,
> I did it on RDR2 because it's the game i experience most problems with boost clocks.
> Btw, this is with the stock Palit GamingPro non-OC BIOS, updated with the one on the Palit website.
> 
> Here, this is with stock settings. Default fan curve, no oc, 100% power limit (320W)


Pulled your pictures out of the reply quote.

I think you're extremely CPU limited in your scenario. I can't think of any reason the FPS is so low, especially as you're just running 1080p! Here is the same area on my 2080 Ti @ 1440p. I am using optimized settings, though: most settings on Ultra but a few dropped down to High.

[edit]
Oh, I saw your other reply that you're running 1080p with display scaling to 4K. Yeah, your screenshots don't look downscaled at all; way too much sharpening. Maybe try lower scaling to get closer to 60 fps, or just turn down some settings if you cranked everything to max.


----------



## delreylover

Shadowdane said:


> Pulled your pictures out of the reply quote.
> 
> I think you're extremely CPU limited in your scenario. I can't think of any reason the FPS is so low, especially as you're just running 1080p! Here is the same area on my 2080 Ti @ 1440p. I am using optimized settings, though: most settings on Ultra but a few dropped down to High.


I'm running the game at 4K: a 1080p monitor with 2x res scale equals 4K. Besides, everything is maxed out, including water physics, and yes, it's extremely limited by the CPU but still performs fine, I guess...


----------



## VoRtAn

delreylover said:


> I'm running the game on 4K. 1080p monitor with 2x res scale hits 4K. Besides, everything is maxed out, including water physics and yes, extremely limited by CPU but still performs fine, I guess..


Can you do the same testing at 1080p only?

Besides that, check if you can raise the voltage like I said before.


----------



## delreylover

VoRtAn said:


> Can you make same testing on 1080p only ?
> 
> Beside that, check if you can raise voltage like said before.


Welp, the GPU crashed when changing resolutions. Tried the voltage though, and unfortunately, zero effect. Very expected, since the card can't even reach 0.900V due to power choking and barely stays around 0.887V...
Will edit this post after I test at 1080p.

Edit: Tested at 1080p and yes, since my performance is now CPU bound and the GPU isn't as busy, boost clocks can reach good levels and voltages stay sane. Though, unless my card is under 97-99% load, the results are like this. I mean, it's weird, but when GPU load is low, boost clocks and voltages go high; when GPU load is high (over 95%), boost clocks and voltages go down because the card is choked by the power limit.
Here is the screenshot showing the result:

[Screenshot "1080p", hosted on ImgBB (ibb.co)]


----------



## Shadowdane

delreylover said:


> Welp, the GPU crashed when changing resolutions. Tried the voltage though, and unfortunately, zero effect. Very expected, since the card can't even reach 0.900V due to power choking and barely stays around 0.887V...
> Will edit this post after I test at 1080p.
> 
> Edit: Tested at 1080p and yes, since my performance is now CPU bound and the GPU isn't as busy, boost clocks can reach good levels and voltages stay sane. Though, unless my card is under 97-99% load, the results are like this. I mean, it's weird, but when GPU load is low, boost clocks and voltages go high; when GPU load is high (over 95%), boost clocks and voltages go down because the card is choked by the power limit.
> Here is the screenshot showing the result:
> 
> [Screenshot "1080p", hosted on ImgBB (ibb.co)]


Try without the voltage slider maxed out. On my 2080 Ti, that had me hitting the power limit significantly sooner due to the higher power draw and voltages; I'd get worse performance maxing it out, since even at stock voltage it bounces off the power limit. The 30 series is likely even worse in that regard, with Nvidia pushing those cards so close to the limits to start with.

I don't have a 3080 yet to test myself; I haven't been able to find one in stock, they all disappear in seconds every time they get restocked!


----------



## VoRtAn

delreylover said:


> ...when GPU load is low, boost clocks and voltages go high; when GPU load is high (over 95%), boost clocks and voltages go down because the card is choked by the power limit.
> 
> [Screenshot "1080p", hosted on ImgBB (ibb.co)]


That's normal behaviour: clocks and voltage rise together; that's how dynamic boost works.
If your load is low, it will reach higher clocks, but to get higher clocks the voltage needs to go up as well, and when you reach the limit, the clocks go down.

I don't have experience with that model/brand, so I don't know if it's normal to game at 4K and have the max boost below Nvidia specs; it seems weird to me, considering it's marketed as a "4K card". I tried upscaling on my 3080 a few days ago and the clocks were well above Nvidia specs.
Do you know anyone else with the same card to test under the same conditions?

Can you get me a screenshot of the voltage and temperature curves, all stock, in Afterburner?


----------



## delreylover

VoRtAn said:


> Do you know anyone else with the same card to test under the same conditions?


@VPII has the same card but the OC variant.



VoRtAn said:


> Can you get me a screenshot of the voltage and temperature curves, all stock, in Afterburner?


I'd love to. But how can I do that? I have no clue...


----------



## Shadowdane

delreylover said:


> I'd love to. But how can I do that? I have no clue...


Ctrl+F opens the voltage/frequency curve editor. There is also a button for it on the new Afterburner skins; you're using a very old skin.


----------



## VoRtAn

With Afterburner open, click on the curve editor and
show me the voltage and temperature graphs, please.


----------



## delreylover

Shadowdane said:


> Ctrl+F opens the voltage/frequency curve editor. There is also a button for it on the new Afterburner skins; you're using a very old skin.


Yeah hehe, got used to this skin for a long time.



VoRtAn said:


> With Afterburner open, click on the curve editor and
> show me the voltage and temperature graphs, please.


Here is the voltage/freq curve:

[Screenshot "voltage", hosted on ImgBB (ibb.co)]

Here is the temperature/freq curve:

[Screenshot "temp", hosted on ImgBB (ibb.co)]

And I changed the theme to match yours, to help you understand it more easily.


----------



## VoRtAn

delreylover said:


> Yeah hehe, got used to this skin for a long time.
> 
> Here is the voltage/freq curve:
> 
> [Screenshot "voltage", hosted on ImgBB (ibb.co)]
> 
> Here is the temperature/freq curve:
> 
> [Screenshot "temp", hosted on ImgBB (ibb.co)]
> 
> And I changed the theme to match yours, to help you understand it more easily.


Your card's curve is very conservative, maybe because of the BIOS / binning / TDP.
Try this:

With the voltage curve open, press and hold Shift.
On the right-most dot of the curve, press and hold the left mouse button and drag the whole curve up (don't release Shift until you release the left mouse button) until, let's say, the top clock reaches somewhere around 1950MHz. Let's not push it too far; just testing.

Press Apply and test the same scenarios that you showed us before.

----------



## cstkl1

That TUF OC BIOS that was shared is not the performance BIOS;

the clocks are still at 1710.

Also, has anybody managed to get the protection re-enabled??


----------



## cstkl1

Hmm, I think people should wait before flashing.

The TUF has an InfoROM.

Flashing the BIOS doesn't affect this, but it causes a date/timestamp mismatch.

The board's BIOS memory chip seems to have a backup function.

Hmm, gonna lay off flashing. Lots of stuff is different.


----------



## Pet_gz

cstkl1 said:


> That TUF OC that was shared is not the performance bios.
> 
> the clocks are still at 1710











[Official] NVIDIA RTX 3080 Owner's Club (www.overclock.net): "hi every one this is my asus rtx 3080 tuf oc performance bios enjoy https://fil.email/TlW9iNRD"


----------



## cstkl1

Pet_gz said:


> [Official] NVIDIA RTX 3080 Owner's Club (www.overclock.net): "hi every one this is my asus rtx 3080 tuf oc performance bios enjoy https://fil.email/TlW9iNRD"


It's not; tested it. The clocks are still 1710,
not the 18xx stock.

The only difference was the power at 100% changed from the non-OC version.
Nvflash64 -6, right?

Also, did you get the protection re-enabled??
Did your InfoROM show a mismatch on the timestamp??


----------



## pdlr

Hello guys. Does anyone know where I can find the BIOS of the Colorful Advanced or the Asus ROG Strix OC? On TechPowerUp, only the two usual ones show up: the FE one and the EVGA FTW3. And although I already have the EVGA one on my MSI Trio, and it works well, you easily reach the power limit as soon as you do a strong overclock.

Thanks.


----------



## ELCID777

So is there any real benefit to flashing the BIOS right now? Most people are saying it doesn't change anything or just flat out doesn't work.


----------



## TK421

There's no significantly higher power limit BIOS out for the 3080 at the moment, I think.


----------



## pdlr

It works for me; I have gained stability and a marginal improvement of about 2% in total.

But I have an MSI GAMING X TRIO with 3x 8-pin and I'm flashing 3x 8-pin BIOSes; I imagine that 2x 8-pin cards can give problems.


----------



## QuatroKiller

Hi guys! I've been holding off upgrading from my 2080 Ti to a 3080 because I wanted to get all the info I could from this thread before deciding which version of the 3080 to pick up. When can we expect the OP to be updated with more information in the FAQ section (like it is for the 2080 Ti thread)?

I know it's only been a few weeks, so maybe I'm way ahead of myself. Thanks!!


----------



## Pet_gz

cstkl1 said:


> Its not. Tested it. The clocks are still 1710.
> not 18xx stock


Works perfectly. I flashed my TUF non-OC version; stock clocks are 1440 core, 1785 boost.


----------



## changboy

I was following the 3090 forum but couldn't get a card, but I just got an order in on the Asus Strix OC RTX 3080. Hope I will like it.


----------



## Purple_Light

changboy said:


> I was following the 3090 forum but cant get a card but i just got order on the asus strix oc rtx-3080, hope i will like it


How did you order in Canada ?


----------



## changboy

memoryexpress


----------



## djriful

Just got it yesterday in Canada (ordered on Sep 24, shipped on the 28th). This is the best GPU design and quality I've ever seen in any computer hardware... holy moly! The RTX 3080 Founders Edition weighs like a brick!


----------



## djriful

Purple_Light said:


> How did you order in Canada ?


Join Nvidia Discord, live stock update at milliseconds.


----------



## CptAsian

djriful said:


> Just got it yesterday in Canada (ordered on Sep 24, shipped on 28). This is the best GPU design and quality I ever seen in any computer hardware... holy moly! RTX 3080 Founders Edition weigh like a brick!
> 
> View attachment 2460618
> 
> View attachment 2460619
> 
> View attachment 2460620
> 
> View attachment 2460621
> 
> View attachment 2460622
> 
> View attachment 2460623
> 
> View attachment 2460624


Great photos, the box matches nicely too.


----------



## trailer park boy

Well, I've got to ask you people: what's your opinion, do you think my 7700K would do OK with an RTX 3080 for a year and a half? I'm running a 3440x1440 monitor. I'm almost leaning towards the 3070, which might be more than enough for my monitor, for a few years anyway.


----------



## HyperMatrix

trailer park boy said:


> well i got to ask you peoples,,whats your opinion,do ya think my 7700K would do ok with a RTX 3080 for a year and a half,,i running a 3440 x1440 monitor,im almost leaning towards the 3070,,wich might be more than enough for my monitor,,for a few years anyways,,


Well, 3440x1440 is 34% more pixels than 1440p, and 2.4x the pixels of 1080p. So you've got a decent scenario where the GPU is generally going to be taxed more than the CPU, depending on the frame rate you're hoping to hit. I'd go with the 3080 over the 3070 for a few reasons.

The 3070 is likely to be hot garbage based on the reduced memory and bandwidth. The 3080 has 70% more memory bandwidth and 60% more CUDA cores than the 3070, but it's only priced 40% higher. That's a really good price/performance value. It also means the card will last you longer without you always looking to upgrade it. And at your resolution, you'll be able to take better advantage of ray tracing effects.

I wouldn't think twice about it. Go 3080. Stick to one of the cheaper $699 models like the Founders Edition or the Asus TUF.
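The pixel-count comparison above checks out with quick arithmetic; a throwaway sketch:

```python
# Pixel counts for the resolutions being compared.
resolutions = {
    "1080p": 1920 * 1080,
    "1440p": 2560 * 1440,
    "3440x1440": 3440 * 1440,
}

uw_vs_qhd = resolutions["3440x1440"] / resolutions["1440p"]
uw_vs_fhd = resolutions["3440x1440"] / resolutions["1080p"]

print(f"3440x1440 vs 1440p: {uw_vs_qhd:.2f}x")  # 1.34x, i.e. "34% more pixels"
print(f"3440x1440 vs 1080p: {uw_vs_fhd:.2f}x")  # 2.39x, i.e. "~2.4x the pixels"
```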


----------



## MrTOOSHORT

trailer park boy said:


> well i got to ask you peoples,,whats your opinion,do ya think my 7700K would do ok with a RTX 3080 for a year and a half,,i running a 3440 x1440 monitor,im almost leaning towards the 3070,,wich might be more than enough for my monitor,,for a few years anyways,,


yup


----------



## phara0h

pdlr said:


> Hello guys. Does anyone know where I can find the bios of the Colorful adavanced or the Asus ROG strix oc? in techpowerup, only the two usual ones come out, the one of the FE and EVGA FTW3. And although the EVGA one I already have on my MSI TRIO, and it works well, you easily reach the power limit as soon as you do a strong overclok. T
> 
> thanks.


Did you notice any boost clock improvement after flashing the EVGA vBIOS? What's the power limit now; is it properly increased to 400W? I'm looking to flash my MSI Trio with the EVGA vBIOS too.


----------



## VPII

phara0h said:


> Did you notice any boost clock improvement after flashing the EVGA vbios? What's the power limit now, is it properly increased to 400w? I'm looking to flash my MSI Trio with the EVGA vbios too


Look, I have been benchmarking this Palit RTX 3080 from the day I got it, keeping an eye on power draw, temps and clocks through all the benchmarks. In none of the benchmarks I've run did my power consumption reach 350 watts; it barely went into the 340-watt region, as it was mostly around 320 to 335 watts.

Upon flashing the Palit with the EVGA FTW3 Ultra BIOS (yes, I know it is a 3x 8-pin BIOS flashed onto a 2x 8-pin card), my power draw during the same benchmarks immediately reached 380 to 400 watts. So I asked myself: the clocks and everything else are the same, so how does it all of a sudden pull 15% more power while performance is even worse?

I also decided to try the Asus TUF OC and the normal Asus TUF BIOS; power draw was more or less the same, except that performance was a little worse.

I then decided to try the Gigabyte Gaming OC BIOS. No power increase, but at least it has a 370-watt power limit. I was surprised with the end result.

This here is with the Gigabyte Gaming OC BIOS:


https://www.3dmark.com/spy/14267568



And this here is with the Palit's normal BIOS and the power limit increased to 350 watts:
https://www.3dmark.com/spy/14267657

Clock speeds are the same. Yes, I know 3DMark states that the clocks with the Gigabyte BIOS are lower, but that run was at stock, and at stock you're talking about 1800 MHz, as you'll see on their website.


----------



## Jonathon Schott

I kind of hope EK or someone does a water block for the Founders Edition that also has the 'V' missing like the PCB. I understand it would be hard to manufacture, but tell me I'm wrong when I say that would look amazing, or even just a clear window there. I would get the FEd just to use that block. Just wanted to put that out there.

Just realized something about my shorthand. Brings a whole new meaning to the FEd lowering interest rates... Too bad for us the rates stayed the same...


----------



## ELCID777

_Can anyone recommend a BIOS for the Zotac Trinity 3080, please? This card desperately needs a higher power limit than it currently carries with the stock BIOS._


----------



## ChaosBlades

I would just like to remind everyone about changing the BIOS to get a higher power limit: not every card's components can handle higher power limits. Just because you flash a BIOS and "it works" does not mean it will work forever. The reason cards do not have a higher power limit out of the box isn't the company trying to make their higher-end products' prices justifiable; the power plane isn't designed to pump out that much power. Remember, these cards are pumping out over 300 watts this gen. If you want a higher power limit, the safest bet is to trade in your card for a Strix or FTW3 when they are available.


----------



## shiokarai

ChaosBlades said:


> I would just like to remind everyone about changing BIOS to get a higher power limit. Not every cards components can handle higher power limits. Just because you flash a BIOS and "it works" does not mean it will work forever. The reason cards do not have a higher power limit out of the box isn't the company trying to make their higher end products prices justifiable. The power plane isn't designed to pump out that much power. Remember these cards are pumping out over 300 watts then gen. If you want a higher power limit the safest bet is to trade in your card for a Strix or FTW3 card when they are available.


A Zotac rep explicitly said the Trinity is gimped because of product segmentation, so no, it's not like that.


----------



## Arni90

shiokarai said:


> Zotac rep explicitly said trinity is gimped because of product segmentation, so no, it's not like that.


It does seem to have the most cut-down PCB and VRM of all custom 3080s, and the cooling solution for the VRM looks a bit weak. Not sure if I'd bother with higher power limits on that card unless I also gave it better cooling.


----------



## shiokarai

Arni90 said:


> It does seem to have the most cut-down PCB and VRM of all custom 3080s, and the cooling solution for the VRM looks a bit weak. Not sure if I'd bother with higher power limits on that card unless I also gave it better cooling.


Well, I got one water-cooled, and this card really needs a better BIOS. The clock is about 1965 MHz stable, temps about 31-32°C under 100% load in Doom Eternal at 3840x1600 + HDR + G-Sync [email protected] (i.e. the Alphacool block + loop can dump much, much more heat than the card on the stock BIOS is generating, which is only about 300-320W).


----------



## Vapochilled

trailer park boy said:


> well i got to ask you peoples,,whats your opinion,do ya think my 7700K would do ok with a RTX 3080 for a year and a half,,i running a 3440 x1440 monitor,im almost leaning towards the 3070,,wich might be more than enough for my monitor,,for a few years anyways,,


I have a 6700K, and at 5K ultrawide (5120x1440) or 4K with all settings on ultra, in 99% of games I get the same fps as a 9900K (without OC).
My CPU is at 4.7 and average usage during gaming is 80 to 85%, so... I still have 15% headroom, hehehe.

If you play at 2560x1440, then you feel the difference. It's huge; like an 80% difference going from a 6700K or 7700K to a 9900K. But with settings on ultra at 4K or above... it's like... 71 fps for me, 72 fps for the 9900K.

And I can tell you that I'm thinking about upgrading to a 9900K on the same Z170. The pin mod + BIOS mod is working properly.


----------



## Anthraksi

So what's the current situation on BIOS flashing? Can you flash a 2x 8-pin card with a 3x 8-pin bios to get higher power limit without problems, or should you have the same amount of power connectors?


----------



## Vapochilled

I think that at the moment a shunt mod would be the only power mod option... BIOSes are not really changing anything.

BTW, avoid the Gigabyte Eagle OC... those pin connectors are the most ****ty ones I've seen.


----------



## shiokarai

Very few people have the cards, and the top models aren't really out yet. Too soon to say BIOS flashing is not viable this time around.


----------



## BluePaint

Two 3090 owners reported successfully raising power limit by flashing a Strix BIOS to a MSI Trio. Maybe that will be possible for the 3080 too.
[Official] NVIDIA RTX 3090 Owner's Club


----------



## Vapochilled

BluePaint said:


> Two 3090 owners reported successfully raising power limit by flashing a Strix BIOS to a MSI Trio. Maybe that will be possible for the 3080 too.
> [Official] NVIDIA RTX 3090 Owner's Club


That's a 3x 8-pin to 3x 8-pin flash... The doubt is 3x 8-pin to 2x 8-pin,
like Strix to TUF.


----------



## Johneey

shiokarai said:


> well, i got one water-cooled and this card really needs a better bios, clock is about 1965Mhz stable, temps about 31-32 celsius under 100% load - Doom Eternal 3840x1600 + HDR + g-sync [email protected] (ie. alphacool block + loop can dump much much more heat that card on stock bios is generating, which is about 300-320w only)
> 
> View attachment 2460644
> View attachment 2460645


31 degrees is amazing. Can you do a Time Spy? I have 45 with the block on a 3090 with the 390-watt BIOS.


----------



## shiokarai

Johneey said:


> 31 degreese is amazing can u do a timespy? i have 45 with the block on 3090 390 watt bios


will do, later. btw, I've got something like 4 x 560 rads for GPU only


----------



## changboy

shiokarai said:


> will do, later. btw, I've got something like 4 x 560 rads for GPU only


You mean you have a car radiator to cool your GPU, lol.


----------



## shiokarai

changboy said:


> You mean, you have a car radiator to cool ur gpu lol.


That's basically what a CaseLabs STH10 + 3 pedestals is like (my setup).


----------



## lordzed83

Hello guys. Won't be long till I have the Eagle anyway. I asked around and got the BIOS from the Gaming OC for anyone that wants to try flashing it onto the EAGLE:
http://wigglr.co.uk/gb3080goc.zip
@shiokarai Nielze kiedy flash albo hard mod?? Ja to bym sobie Wallmount zrobil Chlodnice w zimie trzymam kolo nog w lecie z daleka jak karta kopie to grzania nie trzeba [Not bad; when are you doing a flash or a hard mod?? I'd make myself a wallmount. I keep the radiator by my legs in winter and far away in summer; when the card is working hard, you don't need heating.]
@changboy well, that would be my setup. My loop takes almost 2L of coolant. This cools 450W from the wall PASSIVELY with very nice temps.


----------



## Mucho

shiokarai said:


> will do, later. btw, I've got something like 4 x 560 rads for GPU only


Nice, I'm going this way: the Watercool MO-RA3 360 Black Monster radiator, €219.95 (shop.watercool.de).

And I found a nice mod for cooling the backplate:


----------



## lordzed83

@Mucho that's why I love these forums; people here have PROPER overclocking cooling solutions. Basically you will have the same power as myself. That is a fantastic idea; is that not an M.2 cooling block??


----------



## Johneey

*** guys, you have impressive cooling solutions 🥴 I only have normal cooling.

22073 Time Spy graphics score.
Which scores do you get with these ****ing nice car coolers?


----------



## Johneey

lordzed83 said:


> Hello Guys Wont be long till i have Eagle anywya Asked around and got Bios from Gaming OC for Anyone that wants to try flash that on to EAGLE
> http://wigglr.co.uk/gb3080goc.zip
> @shiokarai Nielze  kiedy flash albo hard mod?? Ja to bym sobie Wallmount zrobil Chlodnice w zimie trzymam kolo nog w lecie z daleka jak karta kopie to grzania nie trzeba
> @changboy well that would by Mine setup . My loop takes almost almost 2l of coolant  This cools 450w off wall PASSIVE with very nice temps.
> View attachment 2460700


This cooler is as big as the one on my BMW, nice.


----------



## lordzed83

@Johneey oi, it's a custom loop, it's good, very nice and clean. I'm more of a performance > aesthetics sort of guy, but you can't see my PC; it's hidden in the desk, just a bit of red glow from the side.
That's how it looks in my cave, with full room-scale VR and a full-on CLUB setup; after COVID killed the raves I got myself club equipment.


http://imgur.com/DXyHkoN


----------



## pewpewlazer

shiokarai said:


> will do, later. btw, I've got something like 4 x 560 rads for GPU only


What are your ambient temps like? Do you know your water temperature? 31*C load on the GPU is crazy good regardless of how many radiators you have. I'm over here dreaming of my 2080 Ti staying under 40c... Your card will probably take off like a rocket ship once it gets some more juice (power limit)!


----------



## shiokarai

pewpewlazer said:


> What are your ambient temps like? Do you know your water temperature? 31*C load on the GPU is crazy good regardless of how many radiators you have. I'm over here dreaming of my 2080 Ti staying under 40c... Your card will probably take off like a rocket ship once it gets some more juice (power limit)!


Ambient is 20-21°C; also there are 2x D5 pumps in the GPU loop going at 100%. Water temp when the GPU loop is saturated (after a few hours of playing games): 28°C; GPU core: 31-33°C. I'm amazed myself. Previously, running an RTX 2080 Ti with a 380W BIOS mod and an Aquacomputer block, I never saw temps this good; maybe it's the different block construction (it's quite restrictive) or the lower power requirement (320W vs 380W). Also, the PC is in a separate room.

BTW, now testing an OC of +120 core / +500 mem. After 6-8 more hours of Doom Eternal it's perfectly stable, clock hovering around 2000-2050 MHz (of course the ****ty PL holds the card back).

@lordzed83 z flashem czekam na jakieś konkretne info czy to aby bezpieczne, póki co gram [With flashing, I'm waiting for some concrete info on whether it's actually safe; for now I'm just gaming.]


----------



## Mucho

lordzed83 said:


> @Mucho thats why i love those forums here people got PROPER overclocking cooling solutions. Basically You will have same power as myself. That is fantastic idea is taht not M2 cooling block ??


A RAM cooler by Alphacool:

https://www.alphacool.com/shop/ram-cooler/19803/alphacool-d-ram-cooler-x4-universal-plexi-black-nickel

Or this one with RGB: the Bykski acrylic RAM water block (2- and 4-channel memory support, copper cold plate, RGB/RBW), about US$35.20 on de.aliexpress.com.


----------



## shiokarai

Mucho said:


> Nice, I'm going this way: the Watercool MO-RA3 360 Black Monster radiator, €219.95 (shop.watercool.de).
>
> And I found a nice mod for cooling the backplate:
> View attachment 2460702


I wonder how much it helps with the 3080? I'd imagine with the 3090 it would really help, but with the 3080? Also, was it hard to mod the backplate? Super cool mod.


----------



## Mucho

shiokarai said:


> I wonder how much does it help with the 3080? I'd imagine with the 3090 it would really help, but with 3080? Also, was it hard to mod the backplate? Super cool mod


It's not my mod. A guy from a German forum made it: 4 holes, 4 threads, 6mm screws and some paste.

53°C before the mod, 38°C after the mod.


----------



## mrv153

Good result for an FTW3 Ultra Gaming?

https://www.3dmark.com/spy/14299578

Feels a bit low :/
+140 core, +400 mem


----------



## HyperMatrix

lordzed83 said:


> @shiokarai Nielze  kiedy flash albo hard mod?? Ja to bym sobie Wallmount zrobil Chlodnice w zimie trzymam kolo nog w lecie z daleka jak karta kopie to grzania nie trzeba


English only please. It’s part of the forum rules. For all we know, you could be sharing secrets to unlocking +100MHz extra OC for free.


----------



## Daepilin

I'm pretty sure we still have no clue about the Strix non-OC vs. OC in terms of power limit?

Some Strix OC reviews are slowly popping up, indicating 440-450W, but it's quite annoying there is no comparison...

I ordered a non-OC out of stupidity (Sep 18), and switching my pre-order to an OC now (ordered Sep 28; they claimed ~1.5k orders, I think) would probably mean a wait until late November/December...

Additionally, gaming performance will only be minimally different...

I need some advice, I think.


----------



## HyperMatrix

Daepilin said:


> I'm pretty sure we still have no clue about Strix Non oc vs oc in terms of power limit?
> 
> Some Strix oc reviews slowly seem to pop up, indicating 440-450w, but it's quite annoying there is no comparison...
> 
> I ordered a non oc out of stupidity (18.9)and now switching my pre order to an oc (ordered 28.9, they claimed 1.5k orders I think) would probably mean a wait until late November/December...
> 
> Additionally gaming Performance will only be very minimally different...
> 
> I need some advice I think


Unless there is some form of binning being done, which we haven't seen yet, OC vs. non-OC is nothing a BIOS flash can't fix.


----------



## ELCID777

So has anyone flashed their Zotac Trinity 3080 yet?? I am waiting to see if anyone has been able to flash a different BIOS and remove that ridiculous power limit Zotac intentionally gimped the card with.


----------



## Chuckclc

Was able to snag an EVGA FTW3 RTX 3080 on Newegg tonight. Question: does anyone run these on a 650W PSU? I have an old 750W Antec TruePower in another system, but in my main system I use a newer EVGA 650W SuperNOVA G+. Think that will be enough? Coupled with an R5 3600, and no crazy power suckers anywhere else in my build.
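A rough headroom check for the 650W question above; a sketch only, and the wattages are ballpark assumptions rather than measured figures:

```python
# Rough PSU budget for an R5 3600 + RTX 3080 build.
# All wattages below are ballpark assumptions, not measurements.
draws = {
    "RTX 3080 (stock power limit)": 320,
    "R5 3600 (stock, under load)": 90,
    "motherboard/RAM/SSD/fans": 60,
}

total = sum(draws.values())
psu = 650
headroom = psu - total

print(f"estimated load: {total} W, headroom on a {psu} W PSU: {headroom} W")
# ~470 W estimated leaves ~180 W of headroom. Ampere's short transient power
# spikes can eat into that, which is why 750 W is the commonly recommended
# floor for a 3080; a quality 650 W unit is usually workable at stock limits.
```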


----------



## Mucho

ELCID777 said:


> So has anyone flashed their Zotac Trinity 3080 yet?? I am waiting to see if anyone has been able to flash to a different bios and remove that ridiculous power limit Zotac intentionally gimped the card with.


You could try and flash the Palit bios with 350W PL


----------



## VPII

Mucho said:


> You could try and flash the Palit bios with 350W PL


I was going to suggest the same. This power limit is actually a real pain, and flashing the BIOS does not seem to help; in most cases you lose performance at the same clocks.


----------



## VPII

Interestingly, if you search all of the RTX 3080 Time Spy submissions in 3DMark (ignoring the first one, as it was done with LN2 on an Nvidia Founders Edition), the next ones, from what I can see in the graphics card descriptions, were done with Asus, then EVGA, and then Colorful cards.

Looking at the Asus submission, it had a 14 MHz drop from max clock to average clock.

The EVGA submission had an 80 MHz drop from max clock to average clock.

And the Colorful seems to have run lower than stock, with a 35 MHz drop from max clock to average clock.

From this, it seems the Asus has the best power limit, and you can see from the GPU temps that the cards were running stock air coolers, given the max load temps of 53 to 54°C.


----------



## Talon2016

Grabbed an RTX 3080 FTW3 Ultra this morning at Microcenter. I've decided against messing with the vBIOS, since this round it doesn't seem to be working; I'm sure it will get figured out eventually, but for now I no longer have to worry. This card is beast mode; I've seen it pull as high as 415W under heavy load. It runs 1950-2000 MHz stock out of the box and holds high boost under load for my 4K 144Hz gaming.


----------



## VPII

Talon2016 said:


> Grabbed an RTX 3080 FTW3 Ultra this morning at Microcenter. I decided messing with the vBIOS this round doesn't seem to be working. I'm sure it will get figured out eventually, but for now I no longer have to worry. This card is beast mode and I've seen it pull as high as 415w under heavy load. Runs 1950-2000Mhz out of box stock and holds high boost under load for my 4K 144hz gaming.


Do me a favour: run 3DMark Time Spy with the power slider maxed out, then post the result and let me know what the max power draw was. I am asking out of interest, as the max power draw I see during Time Spy with my card clocked up to 2220 MHz core is only 334 to 335 watts. Yes, I clock it that high because I know with my power limit it would never hold those clocks, but at least the dips in clocks would still be in the 2000 MHz range, dropping to the high 1900s.


----------



## KenjiS

I lucked out and managed to get an FTW3 3080 from Newegg earlier.

Looking forward to when it gets here so I can see how far I can push it.


----------



## Talon2016

VPII said:


> Do me a favour, run 3dmark Time Spy with the power slider maxed out and then post the result and let me know what the max power draw was. I am asking out of interest as the max power draw I see during time spy with my card clocked up to 2220mhz core is only 334 to 335watt. Yes I clock it that high as I know with my power limit it would never run those clocks but at least the drops in clocks would still be in the 2000mhz range dropping to high 1900 range.


Max power draw was 410w with 390-400w being average


----------



## Daepilin

HyperMatrix said:


> Unless there is some form of binning being done, which we haven't seen yet, OC vs. Non-OC is nothing a bios flash can't fix.


Hmm, it voids the warranty of course... I will have to think about it.


----------



## jexux

mrv153 said:


> Good result for a FTW3 ultra gaming?
> 
> 
> https://www.3dmark.com/spy/14299578
> 
> 
> 
> Feels a bit low :/
> 140 clock 400 mem


I think so. This score is more or less the same as the non-OC version. Try a curve OC and +600 mem.
Are the mems Samsung or Micron?
Thanks.


----------



## VPII

Talon2016 said:


> Max power draw was 410w with 390-400w being average


You see, this is what I cannot understand. My card at stock, with a max boost of 2055 MHz, would pull 333, maybe 335 watts. If I overclock the card to 2220 MHz, it passes Time Spy and still pulls only 335 to 336 watts, but average clocks would be around 1981 MHz. Now, I found this rendering benchmark to run on the RTX 3080, and I have to say, looking at the max temps, it gets into the 56°C range even with the fans running at 100%. Running Time Spy with the fans at 100%, you'll see at most 53°C, but mostly 52°C.

The funny thing is, with this OctaneBench 2020.1 you'll see the power draw going up to 348 watts (my max allowable is 350 watts), but clocks remain way above 2000 MHz; the lowest, at the stock 2055 MHz, would be 2025 MHz. Now, I have run my card all the way up to 2190 MHz and it passed OctaneBench without an issue. The first pic is running stock and the second with 2190 MHz core.


----------



## VPII

Okay, the first pic was 2190 MHz core and the second stock at 2055 MHz core.


----------



## HyperMatrix

jexux said:


> I think so. This score is more or less than NO OC version. Try to curve OC and +600 mem.
> The mems are samsung? Or Micron.
> Thanks.


Only Micron makes PAM4 GDDR6X. The 3070 uses normal GDDR6, so it can be sourced from anyone, but all 3080/3090 cards use Micron memory.


----------



## Nizzen

Mucho said:


> Nice, I'm going this way: the Watercool MO-RA3 360 Black Monster radiator, €219.95 (shop.watercool.de).
>
> And I found a nice mod for cooling the backplate:
> View attachment 2460702


Do you have the source of this mod?
One of the best cooling mods I've ever seen.


----------



## acoustic

Talon2016 said:


> Grabbed an RTX 3080 FTW3 Ultra this morning at Microcenter. I decided messing with the vBIOS this round doesn't seem to be working. I'm sure it will get figured out eventually, but for now I no longer have to worry. This card is beast mode and I've seen it pull as high as 415w under heavy load. Runs 1950-2000Mhz out of box stock and holds high boost under load for my 4K 144hz gaming.


How'd you know Microcenter would have stock? I have one 50 minutes away, but I'm not going to stand in line for 4 hours for a video card.


----------



## Chrisch

Hello,

is there maybe anyone here who can upload an ASUS 3080 TUF BIOS, or another one with a higher power limit?

I have an MSI 3080 Ventus 3X OC with only 320W, and I tried a Gigabyte (370W) and an EVGA (400W) BIOS, but neither helps.

Regards
Chris


----------



## KenjiS

jexux said:


> I think so. This score is more or less than NO OC version. Try to curve OC and +600 mem.
> The mems are samsung? Or Micron.
> Thanks.



Actually, maybe try pulling the mem back. The new GDDR6X memory has error correction in it; in short, instead of artifacting when you go beyond what it can handle, it just slows down and you lose performance.

My other thought is that Time Spy runs at 1440p, and even at 1440p the 3080 can be a tad CPU-bottlenecked. Your CPU is only at 4.8; is there any headroom to go higher on it? Maybe try Time Spy Extreme and see where you land.
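Because of that error correction, the usual way to find the real memory-clock sweet spot is to benchmark at each offset and stop when the score plateaus or regresses. A minimal sketch of that search; `run_benchmark` is a hypothetical stand-in for running e.g. Time Spy at a given offset and reading back the graphics score:

```python
def find_mem_sweet_spot(run_benchmark, step=100, max_offset=1200, tolerance=0.002):
    """Walk the memory offset upward until the benchmark score stops improving.

    GDDR6X error correction means an unstable offset usually shows up as a
    *lower* score rather than artifacts, so a score regression beyond a small
    noise tolerance is the stop signal.
    """
    best_offset, best_score = 0, run_benchmark(0)
    for offset in range(step, max_offset + step, step):
        score = run_benchmark(offset)
        if score < best_score * (1 - tolerance):  # regression: EDR is kicking in
            break
        if score > best_score:
            best_offset, best_score = offset, score
    return best_offset, best_score

# Toy stand-in: score rises until +600, then error correction costs performance.
def fake_benchmark(offset):
    return 17000 + offset * 2 if offset <= 600 else 18200 - (offset - 600) * 5

print(find_mem_sweet_spot(fake_benchmark))  # (600, 18200)
```

With the toy score curve, the search lands on +600, which matches the plateau reported on the Eagle a few posts down.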


----------



## Mucho

Nizzen said:


> Do you have the source of this mod?
> One of the best coolingmods I ever seen


Nothing official. He only posted the photo and that he is using an Alphacool RAM cooler.


----------



## jexux

KenjiS said:


> Actually maybe try to pull the mem back, the new GDDR6X memory has error correction in it, in short, instead of artifacting when you go beyond what it can handle it just slows down and you lose performance.
> 
> Other thought to me is that Time Spy is 1440p and even at 1440p the 3080 can be a tidge CPU bottlenecked. your CPU is only at 4.8 is there any going higher on it? Maybe try Time Spy Extreme and see where you land


On the Gigabyte Eagle that I have, if I go over +600 the score does not improve, but up to +600 it does. +400 seems low to me for the ULTRA; supposedly it has everything better than the Eagle. Regarding the CPU OC, I have tested my 9900K at 4.9 and 5.1 and the GPU score is no better; only the CPU score goes up.


----------



## sblantipodi

VPII said:


> You see this is what I cannot understand. My card at stock with max boost of 2055mhz would pull 333 maybe 335watt. If I overclock the card to 2220mhz it would pass Time Spy and still pull only 335 to 336watt but average clocks would be like 1981mhz. Now I found this reneding benchmark to run with the RTX 3080 and I have to say when looking at the max temps it would be into the 56c but with fans running 100%. Now running Time Spy with fans running 100% you'll see at most 53c but mostly 52c.
> 
> Funny thing is with this Octanebench 2020.1 you'll see power draw going up to 348watt, my max allowable it 350watt, but clocks would remain way above 2000mhz, lowest at stock 2055mhz would be 2025mhz. Now I have run my card all the way up to 2190mhz and it passed the Octanebench without an issue. First pic is running stock and the
> View attachment 2460753
> second with 2190mhz core.
> 
> View attachment 2460751


Wonderful skin! What skin are you using on MSI Afterburner?


----------



## VPII

sblantipodi said:


> wonderful skin, what skin are you using on MSI Afterburner?


The one it came with, not sure actually


----------



## 6u4rdi4n

sblantipodi said:


> wonderful skin, what skin are you using on MSI Afterburner?


That is the MSI Mystic skin


----------



## Anth0789

Just preordered the Gigabyte Aorus. I should get it by mid-October; can't wait to get it!


----------



## Purple_Light

Chrisch said:


> Hello,
> 
> is here maybe anyone who can upload a ASUS 3080 TUF Bios or other with higher Powerlimit?
> 
> i have a MSI 3080 Ventus 3X OC with only 320W and i tried a Gigabyte (370W) and EVGA (400W) BIOS but both dont help.
> 
> Regards
> Chris


They are already in the previous posts; look around page 30 and up.

It would be nice to put the list of posted BIOSes on the first page.


----------



## Chrisch

Thanks, found them. I tried the TUF, TUF OC, Gigabyte OC and EVGA BIOSes on my MSI Ventus 3X OC, but the power limit is still 320W.

Maybe it's a hardware-related thing.


----------



## Mucho

Chrisch said:


> Thanks, found them. Tried TUF, TUF OC, Gigabyte OC and EVGA Bios on my MSI Ventus 3X OC but Powerlimit is still 320W
> 
> Maybe its a hardware related thing.


Try the Palit OC bios with 350W PL


----------



## Vapochilled

I used the Gigabyte Gaming OC BIOS on my Gigabyte Eagle OC; I only got 10W of improvement.


----------



## KenjiS

I'd say a list of alternate BIOS downloads, and maybe some kind of 3DMark score list to help people dial in their cards? Idk...


----------



## finalheaven

Joining the club with an Nvidia RTX 3080. Great upgrade from 1070.


----------



## Talon2016

acoustic said:


> How'd you know Microcenter would have stock? I have one 50min away but not going to stand in a line for 4 hours for a video-card.


I had spoken to an employee in the store a few days prior and was told they've been getting stock almost daily. They don't post it online, so it always shows SOLD OUT, but they get UPS and FedEx trucks prior to opening, or shortly after, with stock, again almost daily. They seem to get emailed in the morning with a tracking number and the shipping contents. When I got there, they received two FedEx trucks with Gigabyte OC 3080s. Then I was told FTW3 Ultras were on a UPS truck due for delivery. Me and 7 others waited, and we all got cards, as the box had 8 cards in it.

I got there around 9:30am, about 10th in line. So you definitely don't need to get there hours early. I was out of the store around 11am with my FTW3 Ultra in hand.


----------



## Chrisch

Mucho said:


> Try the Palit OC bios with 350W PL


Tried it. First bios that doesn't work; with this one I can't install the graphics driver.


----------



## VoRtAn

3080 bios from Evga FTW Ultra works perfectly on MSI Gaming X Trio, i don't expect much difference on air cooling daily.


----------



## acoustic

Talon2016 said:


> I had spoken to an employee in the store a few days prior and was told they've been getting stock almost daily. They don't post it online, so it always shows SOLD OUT. But they get UPS and FedEx trucks prior to opening or shortly after opening with stock, again almost daily. They seem to be getting emailed in the morning with a tracking # and shipping contents. When I got there they got 2 FedEx trucks with Gigabyte OC 3080s. Then I was told FTW3 Ultras were on a UPS truck due for delivery. Me and 7 others waited and all got cards as the box has 8 cards in it.
> 
> I got there around 930am, about 10th in line. So you def don't need to get there hours early. I was out of the store around 11am with my FTW3 Ultra in hand.


Thanks a bunch buddy. Where are you located? I'm going to try the St. David's, PA Microcenter tomorrow.


----------



## Talon2016

VoRtAn said:


> 3080 bios from Evga FTW Ultra works perfectly on MSI Gaming X Trio, i don't expect much difference on air cooling daily.


Really? Run stock clocks with the power slider maxed on both runs and check the score. The vBIOS will report that it's pulling more power on the 2x8-pin cards, but it's not real. See if you actually gain performance. On the 2x8-pin it was leading to lower clocks and lower performance. Curious to see what the perf uplift is on the Trio with the FTW3 bios. Do we know what the Strix power limit is? I am curious to try that on my FTW3.



acoustic said:


> Thanks a bunch buddy. Where are you located? I'm going to try the St. David's, PA Microcenter tomorrow.


Westmont store.


----------



## shiokarai

Nizzen said:


> Do you have the source of this mod?
> One of the best coolingmods I ever seen


Decided to try this myself  Ordered some EK Monarch RAM coolers, will try to put 2 of them on the backplate (should fit), along with the Gelid GP-Extreme thermal pads (replacing stock thermal pads). Also will try conductonaut on the GPU core. Fun times ahead  This should keep me busy until BIOS situation is figured/better cards are out. Also, should I upgrade to 3090 this mod will be really handy.


----------



## Zeakie

Any news on people flashing the Trinity OC? Mine's arriving in 2 days, would love to know the best way to go about it.


----------



## lordzed83

Vapochilled said:


> I used the gigsbyte gaming oc bios on my gigabyte eagle Oc.. the I only got 10w of improvement


Guess it's shunt mod then


----------



## Vapochilled

lordzed83 said:


> Guiess its shunt mod then


But it's not really worth it. You will gain 3% when compared to my current custom curve ... For me.. at 5k, that's like 2fps or so...


----------



## lordzed83

Vapochilled said:


> But it's not really worth it. You will gain 3% when compared to my current custom curve ... For me.. at 5k, that's like 2fps or so...


Well with the mod you're gaining not 10 but more like 150W extra I'd say; under water that should let the chip fly. It's a few-minute job anyway when stripping the card to get the block on, so why not. I'll just do it because I can. The power section looks solid, and with my loop the heat output and power draw in my case are irrelevant. Come on, it's an overclockers forum; even my smartphone is reflashed and overclocked. Do I need it? No... BUT I NEED IT. To this day I'm trying to get better timings on my Zen; even one tertiary timing down would be good. I've invested a good 50 hours on tweaking timings so far, even today I spent 3 hours reflashing 3 bioses looking at whether I can get the timings lower.


----------



## KS81

Is there any information regarding which BIOS is safe to flash to the different cards? I would love to try a BIOS with a higher power limit, i.e TUF OC?

Palit RTX 3080 GamingPro OC /w Alphacool Eisblock


----------



## Mucho

KS81 said:


> Is there any information regarding which BIOS is safe to flash to the different cards? I would love to try a BIOS with a higher power limit, i.e TUF OC?
> 
> Palit RTX 3080 GamingPro OC /w Alphacool Eisblock


Bioses with a higher PL aren't working right now.


----------



## maghabi

VoRtAn said:


> 3080 bios from Evga FTW Ultra works perfectly on MSI Gaming X Trio, i don't expect much difference on air cooling daily.


I just flashed the EVGA bios yesterday on my MSI Gaming X Trio. Do you think there is any risk to the card? VRM issues etc. Average clocks have definitely improved.


----------



## KS81

Mucho said:


> Bios with higher PL aren´t working right now


Ok. Thanks.


----------



## VoRtAn

maghabi said:


> I just flashed the EVGA bios yesterday on my MSI Gaming X Trio. Do you think there is any risk to the card? VRM issues etc. Average clocks have definitely improved.


No issue whatsoever.
The EVGA fan curve is more aggressive, so it's not fair to compare both at the stock fan curve. I've tested a raytracing benchmark and the final score is the same with both cards; haven't tried 3DMark yet.
Because the fans are different on each card, you need to match the RPM rather than the %: EVGA at 73% should equal MSI at 60%, roughly 1850rpm on both. That's what I use for gaming on both cards; for me that noise is OK, above it is too much. Use that speed on each and compare.
Another curious thing I've noticed between the two is that the power draw is divided more equally across the 3 power connectors with the EVGA bios.
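The RPM-matching advice above can be sketched as a small interpolation helper. The calibration tables here are hypothetical; the only anchors taken from the post are that ~73% on the EVGA and ~60% on the MSI both land near 1850 rpm, and real fan curves are not perfectly linear:

```python
# Sketch: compare fan duty cycles across cards by RPM instead of by %.
# Calibration points are illustrative, not measured.

def duty_for_rpm(calibration, target_rpm):
    """Linearly interpolate the duty % that produces target_rpm.

    calibration: list of (duty_percent, rpm) pairs.
    """
    pts = sorted(calibration)
    lo_d, lo_r = pts[0]
    for hi_d, hi_r in pts[1:]:
        if lo_r <= target_rpm <= hi_r:
            frac = (target_rpm - lo_r) / (hi_r - lo_r)
            return lo_d + frac * (hi_d - lo_d)
        lo_d, lo_r = hi_d, hi_r
    raise ValueError("target rpm outside calibrated range")

# Hypothetical calibration tables (duty %, rpm):
evga_ftw3 = [(30, 900), (73, 1850), (100, 3000)]
msi_trio = [(30, 1000), (60, 1850), (100, 3100)]

print(duty_for_rpm(msi_trio, 1850))  # 60.0 — matches the EVGA at 73%
```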


----------



## sblantipodi

Guys, I would like to upgrade my 2080 Ti to a 3080 20GB. How long should I wait? What is the general consensus?
Christmas? Next summer? How long?


----------



## Purple_Light

sblantipodi said:


> Guys, I would like to upgrade my 2080Ti for a 3080 20GB. How long should I wait? What is the general consensous?
> Christamas? Next summer? How long?


After amd's response to 3080.


----------



## pewpewlazer

sblantipodi said:


> Guys, I would like to upgrade my 2080Ti for a 3080 20GB. How long should I wait? What is the general consensous?
> Christamas? Next summer? How long?


My crystal ball says the 20gb 3080s will come out within the next year.


----------



## VPII

It is actually frustrating. If I run 3DMark Time Spy or Port Royal, I'll see max power draw in MSI Afterburner around 336W, but clocks drop around 320W even with the power limit increased to 350W. Now if I run V-Ray or OctaneBench it draws up to 349-351W, clearly visible in the temps rising by 5C in comparison, but the clocks remain within 30, maybe 45MHz of the max clock. If I take the highest clocks I passed OctaneBench with, which was +135MHz for a 2190MHz max clock, it drops at the lowest to 2130MHz but sits around 2190MHz for 70% or so of the benchmark. WHY?
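For anyone trying to pin down where the clocks drop relative to the power limit, here is a minimal logging sketch using nvidia-smi's standard `--query-gpu` fields. It assumes `nvidia-smi` is on PATH; the polling loop is a function you call yourself:

```python
# Minimal sketch: log power draw vs. core clock once per second with
# nvidia-smi, to see where clocks start dropping relative to the power
# limit. power.draw and clocks.gr are standard --query-gpu field names.
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=power.draw,clocks.gr",
         "--format=csv,noheader,nounits"]

def parse_sample(line):
    """Turn one line of 'power.draw, clocks.gr' CSV into (watts, mhz)."""
    watts, mhz = (field.strip() for field in line.split(","))
    return float(watts), int(mhz)

def monitor():
    """Poll once per second until interrupted (needs an NVIDIA GPU)."""
    while True:
        out = subprocess.check_output(QUERY, text=True).strip()
        watts, mhz = parse_sample(out.splitlines()[0])
        print(f"{watts:6.1f} W  {mhz:5d} MHz")
        time.sleep(1)
```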


----------



## lordzed83

sblantipodi said:


> Guys, I would like to upgrade my 2080Ti for a 3080 20GB. How long should I wait? What is the general consensous?
> Christamas? Next summer? How long?





https://i.ibb.co/8YN1bnz/20190813-211547-HDR-1.jpg

In the next 12 months


----------



## MrBridgeSix

Just received my RTX 3080 Eagle, will flashing the Gaming OC BIOS unlock a higher PL?


----------



## Mucho

Finally it's done. Bykski RAM cooler with RGB. Bykski has its own 5V RGB connectors, so I had to change the connector to a JST connector and connected it to the Y-cable of the Alphacool waterblock. Going to test temps now.


----------



## shALKE

Mucho said:


> Finally its done. Bykski Ram-Cooler with RGB. Bykski has its own 5V RGB connectors, so I had to change the connector to a JST connector and connected it to the Y-cable of the Alphacool waterblock. Going to test temps now.
> 
> View attachment 2461035


Do you use any interface between the block and the backplate?


----------



## Mucho

shALKE said:


> Do you use any interface between the block and the backplate?


Only thermal paste.


----------



## Dalekdoc

Mucho said:


> Finally its done. Bykski Ram-Cooler with RGB. Bykski has its own 5V RGB connectors, so I had to change the connector to a JST connector and connected it to the Y-cable of the Alphacool waterblock. Going to test temps now.
> 
> View attachment 2461035


How are the temperature results?


----------



## Dalekdoc

Can anyone help me extract the actual bios file from the Palit 3080 GamingPro OC Bios update tool located here - https://www.palit.com/palit/vgapro.php?id=3763&lang=en&pn=NED3080S19IA-132AA&tab=do

Alternatively, if there is a Palit 3080 user here who has already applied that update, could you please extract your bios and upload it for me to use?

Since this is a reference card, I wanted to flash it on to my Zotac 3080 Trinity OC

Thanks.


----------



## olrdtg

Mucho said:


> Finally its done. Bykski Ram-Cooler with RGB. Bykski has its own 5V RGB connectors, so I had to change the connector to a JST connector and connected it to the Y-cable of the Alphacool waterblock. Going to test temps now.
> 
> View attachment 2461035


Aren't their RAM blocks made of aluminum? You worried about any corrosion?


----------



## outofmyheadyo

Anyone know the PL on strix 3080?


----------



## shiokarai

Mucho said:


> Finally its done. Bykski Ram-Cooler with RGB. Bykski has its own 5V RGB connectors, so I had to change the connector to a JST connector and connected it to the Y-cable of the Alphacool waterblock. Going to test temps now.
> 
> View attachment 2461035


How did you mount it?


----------



## Mucho

Dalekdoc said:


> Can anyone help me extract the actual bios file from the Palit 3080 GamingPro OC Bios update tool located here - https://www.palit.com/palit/vgapro.php?id=3763&lang=en&pn=NED3080S19IA-132AA&tab=do
> 
> Alternatively, if there is a Palit 3080 user here who has already applied that update, could you please extract your bios and upload it for me to use?
> 
> Since this is a reference card, I wanted to flash it on to my Zotac 3080 Trinity OC
> 
> Thanks.








Palit OC bios updated — uploaded to www.file-upload.net (the link now returns "Datei nicht gefunden" / file not found).



olrdtg said:


> Aren't their RAM blocks made of aluminum? You worried about any corrosion?


No, it's copper, nickel-plated.

Bykski Four Channel Memory Water Block - Symphony Edition w/ 5V Addressable RGB (B-RAM-D4-X) — www.bykski.us
Bykski B-RAM-D4-X on Amazon — www.amazon.com


shiokarai said:


> How did you mount it?


4 holes, 4 screws

30min Heaven bench, temps about 5° - 6° C better, but still testing.


----------



## Mucho

If somebody wants to try it out, here is the Inno3D 3080 X3 bios; PL should be at 340W. (Uploaded to www.file-upload.net — the link is now dead.)


----------



## Sparkster

So I flashed my Gaming X Trio with the EVGA Ultra bios and it worked fine. The power limit was at 400 watts, and that's what it used in a quick play of The Witcher 3. My only issue is that I can no longer control the LEDs on the card. Mystic Light no longer detects the card, so I tried LED Sync from EVGA, but that does not work either. Is there any way to control the LEDs? Or if I turn them off before flashing the bios, would they stay off? Thanks


----------



## DStealth

Flashing a BIOS from other vendors has zero effect on my Palit 3080 non-OC version.
GB, TUF, Palit OC and EVGA are all hitting [email protected] spikes, and the results are very similar. The EVGA just shows "higher" consumption in the measuring tools, but it's not real — probably related to its 3x8-pin layout on cards with only 2 connectors.

GB OC BIOS

Asus TUF BIOS

Palit OC BIOS

stock BIOS

Best result is with the Palit OC BIOS, as the fans spin higher: 3500 vs 2850 rpm for the other ones...


----------



## VPII

DStealth said:


> Flashing BIOS from other vendors have zero effect on my Palit 3080 nonOC version.
> All GB,Tuf,PalitOC and EVGA are hitting [email protected] spikes. And results are very similar. Just EVGA has "higher" consumption seen in measuring tools but not real probably 3*8pin related for cards with only 2 pins.
> GB OC BIOS
> View attachment 2461082
> 
> Asus Tuf BIOS
> View attachment 2461085
> 
> PalitOC BIOS
> View attachment 2461086
> 
> stock BIOS
> View attachment 2461084
> 
> Best result is with PalitOC BIOS as fans are spinning higher 3500vs2850 for other ones...


I've experienced exactly the same; it is frustrating to say the least. Regardless of which bios is used, clocks start dropping the moment you reach 320W power consumption. The funny thing is, run V-Ray or OctaneBench on the GPU, look at the power consumption and heat generated, then look at the clock speeds. Seriously: it uses more power and generates more heat, but your GPU clocks only drop slightly and then rise back to max.


----------



## KS81

Dalekdoc said:


> Can anyone help me extract the actual bios file from the Palit 3080 GamingPro OC Bios update tool located here - https://www.palit.com/palit/vgapro.php?id=3763&lang=en&pn=NED3080S19IA-132AA&tab=do
> 
> Alternatively, if there is a Palit 3080 user here who has already applied that update, could you please extract your bios and upload it for me to use?
> 
> Since this is a reference card, I wanted to flash it on to my Zotac 3080 Trinity OC
> 
> Thanks.


I have that .rom file. How would I send it to you?


----------



## Dalekdoc

KS81 said:


> I have that .rom file. How would I send it to you?


Could you upload it to a file sharing service and post the link here? Alternatively, upload it to the Google form one of the users posted here — [Official] NVIDIA RTX 3080 Owner's Club
Thanks


----------



## hemon

3080 TUF: Hardwareluxx-Review Bios edition (#78)

ASUS-TUF-3080-P-BIOS-9402264067.rom
ASUS-TUF-3080-Q-BIOS-9402264068.rom


----------



## mrv153

Are Strix OC BIOS files already available?


----------



## pdlr

Same here, I have been waiting for days... Let's see if someone can upload that bios.


----------



## Mucho

KS81 said:


> I have that .rom file. How would I send it to you?


I've already uploaded the Palit OC bios with the update:

[Official] NVIDIA RTX 3080 Owner's Club — www.overclock.net


----------



## Gunnutzz467

Is there any potential risk in running the FTW3 bios on my Gaming Trio? I'm seeing 430W max, around 370W with the stock bios. I know it's 20 power phases vs 16. Does that only affect overclocks, or will this shorten the life of the card?


----------



## Talon2016

Will go try the Strix 450w vBIOS on my FTW3.


----------



## Avacado

***... People will do just about anything to huff Jensen's leather.


----------



## Talon2016

https://www.3dmark.com/3dm/51349539


-- 19,680 GPU in Time Spy.

The Asus Strix OC vBIOS just made the FTW3 3080 the card to get, IMO. With the Asus vBIOS it is basically shunt modding itself, hugely under-reporting power draw, and the card holds a crazy high boost now. I just managed to take the #1 spot in the US with this vBIOS on my FTW3 Ultra. Reported power draw under max overclock and max fans was around 330W. It's under-reporting, but the card's performance is still scaling and the clocks boost to over 2100MHz and hold near 2100MHz for the entire Time Spy run. Not quite sure how or why it's doing this yet, but it's working, and working well.

Fans run at same 3000rpm max so no issues there. One DP was deactivated, but haven't tried HDMI. Works for me as I use 2 DP ports for my monitors and they work.

Asus Strix OC 3080 vBIOS shared by a nice new owner over at reddit. They did us a solid!

*Asus Strix OC 3080 vBIOS* — File on MEGA (mega.nz)


----------



## pdlr

Talon2016 said:


> https://www.3dmark.com/3dm/51348950


The Asus Strix bios?

Where can I download it?


----------



## Talon2016

pdlr said:


> Bios Asus Strix ?
> 
> Where download?


I edited the original post. Link added.


----------



## pdlr

Excuse me, but I went to the first page and I can't find it.

It's neither in the techpowerup bios collection nor in the ones on Google Drive...

Can you send me the direct link?

Thank you, and I'm sorry.


----------



## Gunnutzz467

Talon2016 said:


> https://www.3dmark.com/3dm/51349539
> 
> 
> -- 19,680 GPU in Time Spy.
> 
> Asus Strix OC vBIOS just made the FTW3 3080 the card to get IMO. With the Asus vBIOS it is basically shunt modding itself and hugely under reporting power draw and the card holds crazy high boost now. I just managed to score the #1 spot in the US with this vBIOS on my FTW3 Ultra. Reported power draw under max overclock and max fans was around 330w. It's under reporting but the card performance is still scaling and the clocks are boosting to over 2100Mhz and holding near 2100Mhz the entire TimeSpy run. Not quite sure of how or why it's doing this yet, but its working and working well .
> 
> Fans run at same 3000rpm max so no issues there. One DP was deactivated, but haven't tried HDMI. Works for me as I use 2 DP ports for my monitors and they work.
> 
> Asus Strix OC 3080 vBIOS shared by a nice new owner over at reddit. They did us a solid!
> 
> File on MEGA — mega.nz


Try port royal


----------



## Talon2016

pdlr said:


> Excuse me, but I go to the first page and I can't find it.
> 
> Neither in the techpowerup bios nor the ones in google drive ..
> 
> Can you pass me the direct link?
> 
> Thank you and I'm sorry.


No, I meant just look up one or two messages and you will see the direct link.


----------



## Gunnutzz467

pdlr said:


> Excuse me, but I go to the first page and I can't find it.
> 
> Neither in the techpowerup bios nor the ones in google drive ..
> 
> Can you pass me the direct link?
> 
> Thank you and I'm sorry.


Bottom of his post, it’s an attachment


----------



## OleMortenF

Talon2016 said:


> Will go try the Strix 450w vBIOS on my FTW3.


Did it work? I was going to buy the FTW3, but I am considering the Strix instead since it has a higher power limit.


----------



## pdlr

Gunnutzz467 said:


> Bottom of his post, it’s an attachment


Ah...oK thanks!


----------



## Talon2016

OleMortenF said:


> Did it work? I was gonna buy the FTW3 but I am considering the Strix instead since it has higher power limit.


Yes it works.



Gunnutzz467 said:


> Try port royal





https://www.3dmark.com/3dm/51351206


----------



## spajdr

MrBridgeSix said:


> Just received my RTX 3080 Eagle, will flashing the Gaming OC BIOS unlock a higher PL?


Hi mate, did you try some flashing already on Eagle? As I'm getting card soon too.


----------



## Vapochilled

spajdr said:


> Hi mate, did you try some flashing already on Eagle? As I'm getting card soon too.


I already did this....
A 10W gain... if that much....
I don't see the expected 370W.


----------



## zhrooms

DStealth said:


> Flashing BIOS from other vendors have zero effect on my Palit 3080 nonOC version.
> Best result is with PalitOC BIOS as fans are spinning higher 3500vs2850 for other ones...


Very interesting, this definitely needs to be investigated, sadly I can't deep dive into it myself since I decided not to get a 3080, possibly not 3090 either, right now I'm tempted to wait for 3080 20GB variant.

We had a few 2080 Ti cards that had power limits enforced on a deeper level, didn't matter what you flashed, it simply wouldn't exceed it. I recall it happening to multiple Ventus and Lightning users, in other words MSI, what you're describing sounds similar but it's on Palit. Again, definitely needs to be looked into more. Great that you tried them all though!


----------



## Celeras

Came here to update the sig rig and join the club... but geez don't even recognize this place anymore!


----------



## pdlr

This BIOS is a must-have on an MSI Gaming X Trio; on a mere 9700K with 16GB RAM... high-scoring graphics and a great average... Thanks Talon2016.



https://www.3dmark.com/3dm/51355499?





https://www.3dmark.com/3dm/51356438?


----------



## Talon2016

Gunnutzz467 said:


> Try port royal





pdlr said:


> This BIOS is a must have on an MSI TRIO GAMING X, on a mere 9700K with 16GB RAM ... high scoring graphics and great average ... Thanks Talon2016 ..
> 
> 
> 
> https://www.3dmark.com/3dm/51355499?


Nice score! Can you tell me what your card is reporting for power? Is your cable #3 also not reporting power draw? This vBIOS is insane.

Ahh, just noticed you're on the hotfix driver. I wonder if that driver has performance improvements. I would go try that, but it's not an approved driver.


----------



## pdlr

Talon2016 said:


> Nice score! Can you tell me what your card is reporting for power? Is your cable #3 also not reporting power draw? This vBIOS is insane.
> 
> Ahh just noticed you're on the hotfix driver. I wonder if that driver has performance improvements. I would go try that but it's not approved driver.


Yes, I have the hotfix driver; I had problems with black screens in games.

Everything reports correctly, as before, both with the original MSI bios and with the EVGA one.


----------



## Talon2016

pdlr said:


> Yes,i Have HOT FIX driver,i have problems with black screens in any games..
> 
> 
> I see everything correct ... report everything, as before, both with the original MSi bios, and with the EVGA.


Nice! Really excellent silicon quality you got there! 2175Mhz peak, hot damn.


----------



## spajdr

Vapochilled said:


> I already did this....
> 10w gain... If that much....
> I don't see the expected 370w


Thanks for the info


----------



## pdlr

Talon2016 said:


> Nice! Really excellent silicon quality you got there! 2175Mhz peak, hot damn.



It really can handle more... even with peaks of 2190MHz... but those are just peaks, and it is more stable for me to run it like this. When I put the Alphacool water block on it, I think 2190MHz will be a reality.


----------



## DooKey

Joined the club with my EVGA RTX 3080 Ultra Gaming. I'm currently undervolted at 875mv and running at 1920mhz. Cool and quiet with great performance.


----------



## KenjiS

Received my FTW3 Ultra today  Just getting it all setup


----------



## SoldierRBT

EVGA RTX 3080 FTW3 Ultra owners. Does MSI Afterburner let you control all 3 fans properly?


----------



## KenjiS

https://www.3dmark.com/3dm/51362000?



Just a bit of tweaking; I'm about to do Time Spy Extreme for a better indicator of uplift. +113 core and +200 mem — I might have a little more in it, however.

Comparing to my 1080 Ti result, it's a 66% uplift in graphics score, which I think sounds pretty good.


----------



## Dalekdoc

Has anyone flashed the Zotac 3080 Trinity with the Palit OC bios yet? Both are reference designs, so it should work; was wondering if anyone had results to share. Still waiting for my Zotac to arrive...


----------



## ELCID777

Dalekdoc said:


> Has anyone flashed to Zotac 3080 Trinity with the Palit OC bios yet? Both are reference design so it should work, was wondering if anyone had results to share. Still waiting for my Zotac to arrive..


Same, just waiting for someone to report on if it works or not. The Zotac Trinity 3080 is such a power starved card, no matter what you do, it will throttle down clocks because of that ridiculous power limit.


----------



## Masayama

I tried out the Strix bios on my MSI Gaming X Trio. Very stable and high clocked (2100MHz), which is awesome...
The fan stop seems to be working, and it seems to cool fine even on the automatic setting.
My Time Spy score: https://www.3dmark.com/spy/14393865


----------



## Mucho

It looks like reference-PCB cards are only able to use a bios from another reference-PCB card. I flashed the Inno3D bios to the Palit: boost at 1770MHz like the Inno3D and PL at 340W. After flashing back my Palit bios, my PL was back at 350W.


----------



## VPII

zhrooms said:


> Very interesting, this definitely needs to be investigated, sadly I can't deep dive into it myself since I decided not to get a 3080, possibly not 3090 either, right now I'm tempted to wait for 3080 20GB variant.
> 
> We had a few 2080 Ti cards that had power limits enforced on a deeper level, didn't matter what you flashed, it simply wouldn't exceed it. I recall it happening to multiple Ventus and Lightning users, in other words MSI, what you're describing sounds similar but it's on Palit. Again, definitely needs to be looked into more. Great that you tried them all though!


Thank you @zhrooms, I've been saying this all along but nobody actually listened to me. Take my Palit GamingPro OC: it does not matter which bios you flash to it; even if you use the stock bios and increase the power limit by 9% to 350W, you will still lose clocks the moment you reach and pass 320W. This card of mine has been flashed with 6 different bioses, and none perform as well as the original stock bios.


----------



## DStealth

*@VPII Confirming the same behaviour with my Palit card. Did you try the Asus Strix OC already? I'm not home for a couple of days to test.*


----------



## Vapochilled

We would need a 4


DStealth said:


> *VPII Confirming the same behaviour with my Palit card. Did you try Asus Strix OC already . As I'm not home for a couple of days to test ?*


The Asus Strix is a 3x8-pin card. I don't think it would work...


----------



## ssgwright

Just got my ABS prebuilt from Newegg... I got an ASUS TUF 3080!!! Overclocks like a beast! Running +130 on the core, 800 on the mem.


----------



## BluePaint

Thanks for the Strix BIOS!!!

MSI Gaming X with the Strix BIOS on air:
19,754 GPU Time Spy (core 2117 avg, 2145 max, vram +1140)
12,586 Port Royal (core 2123 avg, 2160 max, vram +1125)

I tried the FTW3 BIOS too, but couldn't really achieve better scores than with the default BIOS (12,480 in PR, 19,627 in TS). The additional 40W PL helps of course, but it seems to me that the voltage curve of the Strix is better tuned out of the box in the critical clock range between 2100 and 2200MHz (when setting +100MHz). I still had to limit the max clock in the curve editor to 2175MHz (not reached due to temps), because that's where my GPU gets definitely unstable.
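What "limiting the max clock in the curve editor" boils down to can be sketched as clamping every point of the voltage/frequency curve at a ceiling. The curve points below are made up for illustration; real curves come from Afterburner and step in 15 MHz increments:

```python
# Sketch: clamp an offset voltage/frequency curve at a chosen max clock,
# as done manually in the Afterburner curve editor. Points are invented.

def clamp_curve(curve, max_mhz):
    """curve: list of (millivolts, mhz) points; returns the clamped curve."""
    return [(mv, min(mhz, max_mhz)) for mv, mhz in curve]

curve = [(900, 1980), (950, 2085), (1000, 2160), (1050, 2205)]
print(clamp_curve(curve, 2175))
# → [(900, 1980), (950, 2085), (1000, 2160), (1050, 2175)]
# only the 1050 mV point is flattened; everything else is untouched
```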


----------



## Nyt Ryda

Masayama said:


> I tried out strix bios on msi gaming x trio. Very stable and high clocked (2100mhz), which is awesome...
> The fan stop seems to be working, and it seems to be cooling even on the automatic setting.
> my time spy score : https://www.3dmark.com/spy/14393865


Can you still control the RGB lighting on the Gaming X Trio after the Strix bios was flashed? Which tool did you use for the lighting?

And to everyone who has been flashing BIOSes: are there any downsides besides the obvious risk during the process — i.e. any negative effects afterwards, like having to install drivers a different way? I recall seeing something about having to set the power limits after each driver installation, even after flashing back to the stock BIOS, or else the card would throttle and perform worse than stock.


----------



## VPII

Vapochilled said:


> We would need a 4
> 
> 
> Asus Strix is. 3x pin card. I don't think it would work...


It does not matter. I have tried various bioses on this card, most of them from 2x8-pin cards with 366, 370 and 375W power limits. With all of these the card still acts like 320W is the power limit.


----------



## Chuckclc

Came in today, well yesterday technically.


----------



## Chuckclc

Already pushing the OC up and still no crash at +180 on core and +1000 on Mem.


----------



## DStealth

VPII said:


> It does not matter. I have tried various bioses on this card, most of them 2 x 8pin cards with a 366, 370 and 375watt power limit. With all of these the card still act like 320watt is the power limit.


Do you have any result with the Strix BIOS? I have also tested all the rest with no luck.


----------



## VPII

DStealth said:


> Do you have any result with Strix BIOS. I have also tested all the rest with no luck.


I tried it last night, but the result in Time Spy was lower than what I get with my Palit's stock bios. To put it in perspective: the base clock on the Strix shows 1905MHz, and my base clock is 1740MHz.
If you take boost as +300 to +315MHz from base — which is why my Palit, running stock, boosts up to 2055MHz (sometimes it states 2040MHz, but I think that is temperature related) — then the Strix, with a base clock of 1905MHz, should boost to 2205-2220MHz, or that is how I logically understand it. Now, I have run my Palit at +165MHz on the core, and you'll see in the link below it shows a 2220MHz max clock; this was the closest I got to a 2000MHz average clock speed as per 3DMark.



https://www.3dmark.com/spy/14243859



Now take the run below, which was done with the Strix bios. Yes, the max clock speed only shows 2055MHz, but I cannot see the max boost being only 150MHz above base clock, when on basically all previous generations it was around 300MHz above. Here is the result running the Strix bios with an increased power limit: I checked the power usage and it was around 440W. When I run my Palit bios with an increased power limit, I'll see max 336W or thereabouts.



https://www.3dmark.com/spy/14387607



I will however give it another try now as yesterday I was in a rush.


----------



## Chuckclc

Got some work to do here amongst the R5 3600 guys in Time Spy. Looks like his 3800MHz RAM vs my 3200MHz was the difference? https://www.3dmark.com/3dm/51371482?

Good on Port Royal though. https://www.3dmark.com/pr/374726


----------



## VPII

DStealth said:


> Do you have any result with Strix BIOS. I have also tested all the rest with no luck.


Okay, I tried again now, running with and without the power limit increase. The runs are terrible compared to what I can do with the card running the Palit bios. I have been in contact with Palit support about this and they actually got back to me, but they explained that with 2x8-pin you get 2x150W and the PCIe slot gives you 66W. Seriously, they need to learn a bit. First of all, one 8-pin connector can in fact deliver more than 180W: that is 180W / 12V = 15A, and you can do a little more than 15A on these wires. However, I explained to them that my issue is that no matter that I increase the power limit by 9% to 350W, I still lose clocks at 320W or higher. My actual power draw is lower than 350W, even 340W, when running Time Spy or Port Royal — why can't I keep my clocks right through the bench?
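The arithmetic in the exchange above, written out. Note the 66W slot figure is the one Palit support quoted; the PCIe spec nominally rates the slot at 75W and each 8-pin at 150W:

```python
# Sanity check on the power-budget numbers discussed above.
PCIE_SLOT_W = 66           # figure quoted by Palit support (spec says 75 W)
EIGHT_PIN_SPEC_W = 150     # nominal PCIe spec rating per 8-pin connector

spec_budget = 2 * EIGHT_PIN_SPEC_W + PCIE_SLOT_W
print(spec_budget)         # 366 — nominal budget for a 2x8-pin card

# Current needed to push 180 W through one 8-pin at 12 V:
amps_total = 180 / 12            # 15 A across the connector...
amps_per_wire = amps_total / 3   # ...i.e. 5 A per 12 V hot wire
print(amps_total, amps_per_wire)
```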


----------



## Nyt Ryda

VPII said:


> However, I explained to them that my issue is that no matter I increase the power limit by 9% to 350watt, I still lose clocks at 320watt or higher. I mean my actual power draw is lower than 350watt even 340watt when running like Tme Spy or Port Royal, why can't I keep my clocks right through the bench.


I noticed this while watching people do shunt modding.

Shunt modding just one 8-pin was not enough: it appeared that as soon as the other 8-pin connector draws 150W, the card hits the power limit, even if the shunt-modded 8-pin reads only 75W and you are far from the BIOS power limit.

Maybe this is why cards with 3x 8-pin are doing much better with BIOS flashing, as they don't easily run into that 150W limit (not a hardware limit, some other kind of artificial limit).
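Nyt Ryda's theory reads like a per-rail check rather than a single total budget. A toy model of what that would look like (all caps and readings here are hypothetical, not measured values):

```python
# Toy model of the per-connector limit Nyt Ryda describes: the card
# throttles as soon as ANY rail hits its cap, even if the total
# reported draw is far below the BIOS power limit.
def hits_power_limit(rail_draws, per_rail_cap=150.0,
                     bios_limit=450.0, pcie_slot_draw=66.0):
    total = sum(rail_draws) + pcie_slot_draw
    any_rail_capped = any(d >= per_rail_cap for d in rail_draws)
    return any_rail_capped or total >= bios_limit

# Shunt-modded rail reports only 75W, but the unmodded rail hits 150W:
print(hits_power_limit([75.0, 150.0]))          # True -- throttles despite low total
# Three connectors spread the load, so no single rail reaches the cap:
print(hits_power_limit([110.0, 110.0, 110.0]))  # False
```

Under this model a 3x 8-pin card wins not because of a bigger total budget, but because the same total draw never pushes any one rail to its cap.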


----------



## VPII

Nyt Ryda said:


> I noticed this while watching people doing shunt modding.
> 
> Shunt modding just one 8 pin was not enough as it appeared that as soon as the other 8 pin connectors draws 150w it hits the power limit even if the shunt modded 8 pin is only 75w and you are far from the BIOS power limit.
> 
> Maybe this is why cards with 3 x 8 pins are doing much better with BIOS flashing, as they don't easily run into that 150w limit (not hardware limit, some other kind of artificial limit).


Thanks, well I wish I knew, as it is somewhat irritating. My card can easily do 2145 to even 2160MHz core, as I have tested it with games at 1080p to keep the clocks high.


----------



## KenjiS

Yikes, I thought I wasn't doing half bad, now I see others' scores D:

I'm still figuring out X1, it's been a long time


----------



## Gunnutzz467

BluePaint said:


> Thanks for the Strix BIOS!!!
> 
> MSI Gaming X with Strix BIOS on air:
> 19 754 GPU Time Spy (core 2117 avg, 2145 max, vram +1140)
> 12 586 Port Royale (core 2123 avg, 2160 max, vram +1125)
> 
> I tried the FTW BIOS too but couldn't really achieve better scores than with the default BIOS (12480 in PR, 19627 in TS). The additional 40W PL helps ofc, but it seems to me that the voltage curve of the Strix is better tuned out of the box in the critical clock range between 2100 and 2200MHz (when setting +100MHz). Still had to limit the max clock in the curve editor to 2175MHz (isn't reached due to temps) 'cause that's where my GPU definitely gets unstable.


What is your max watt draw? I’m seeing ~430 on the FTW bios.


----------



## VoRtAn

Card - *MSI 3080 TRIO*
BIOS: _Trio latest version (temperature) vs the Strix one uploaded here_
*Stock, no OC*, fans locked at the same speed on both runs so the result is fair, +/- 1950rpm. The Asus BIOS reports 20W more in Afterburner, same clocks, same final fps; check the graph on the right.









port royal later on same conditions


----------



## vedomedo

Maybe a dumb question, seeing as I assume most people here went for a 3x 8-pin / higher-end card, but has anyone tried flashing the vBIOS on the Ventus OC? In theory it should be able to use the TUF OC BIOS and get a higher power limit, no?


----------



## MikeGR7

vedomedo said:


> Maybe a dumb question seeing as I assume most people here went for a 3x8pin / higher end card but. Has anyone tried flashing vbios on the Ventus oc? In theory it should be able to use the TUF OC bios and get higher powerlimit, no?


In theory yes, but so far people with 2x 8-pin cards see no major improvements from such small changes, and even regressions.
Things may change when the Gigabyte Master vBIOS becomes available.
Until then I say stick to stock.


----------



## vedomedo

MikeGR7 said:


> In theory yes but so far people with 2X8 Pins see no major improvements from such small changes and even regressions.
> Things may change when Gigabyte Master vBios becomes available.
> Until then i say stick to stock.


Fair play. Thanks for the quick response.


----------



## MikeGR7

VoRtAn said:


> Card - *MSI 3080 TRIO*
> BIOS: _Trio latest version (temperature) vs the Strix one uploaded here_
> *Stock, no OC*, fans locked at the same speed on both runs so the result is fair, +/- 1950rpm. The Asus BIOS reports 20W more in Afterburner, same clocks, same final fps; check the graph on the right.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> port royal later on same conditions


Thank you for this comparison.

Interesting observations are that:

A. At the same settings there's no performance regression. Peace-of-mind confirmation.
B. Even with only 20W more power draw, I can see from the Strix frequency diagram that the frequency is even more stable.


----------



## Daepilin

Just saw in the OP that the Strix non-OC is at the same power target/limit as the OC? Is that confirmed by something or just assumed (the 2080 Ti differed between the two)?


----------



## Chrisch

Anyone else with TUF OC having the same problem hitting max 355W with PT @ 110%?


----------



## Sir BLaDE

Chrisch said:


> Anyone else with TUF OC having the same problem hitting max 355W with PT @ 110%?


Yes

Sent from my Redmi Note 7 via Tapatalk


----------



## criminal

Mine will be here tomorrow. I ended up getting lucky and getting an Asus TUF like I wanted. Coming from a 2070 Super, so should be a pretty big upgrade.


----------



## BluePaint

Gunnutzz467 said:


> What is your max watt draw? I’m seeing ~430 on the FTW bios.


450W with the Strix OC, just above 410W with the FTW, and 360W with the original MSI Gaming BIOS in 3DMark benches.

It seems to me that the Strix OC has a better voltage curve which is easier to get stable than the FTW. With the FTW BIOS I wasn't really able to beat my 3DMark stock scores, despite the +50W PT. The Strix was much easier to get stable at frequencies > 2100MHz.


----------



## Vapochilled

Gigabyte Eagle OC with the Gaming OC BIOS and a custom voltage curve: I see 360W max (that's a 20W improvement over the default Eagle OC).

3DMark Fire Strike 4K:
Graphics score: 11 537
Graphics test 1: 63.67 FPS
Graphics test 2: 41.39 FPS

9500 score for Time Spy 4K.

No point in posting combined results... I have a 6700K lol.
I did undervolt, and I see 350W usage at 1920MHz with 0.912V in the heaviest scenes of Time Spy 4K.

My guess is that flashing a higher-TDP BIOS is only useful if you do custom curves. I came up 400 points.
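The large savings Vapochilled sees from undervolting line up with the usual first-order rule that dynamic power scales roughly with frequency times voltage squared. A rough sketch, where the stock operating point (2040MHz @ 1.050V) is an assumed example, not a measured value:

```python
# First-order estimate of undervolting savings: dynamic power scales
# roughly with f * V^2. Leakage and memory power are ignored, so this
# is only a ballpark illustration.
def relative_power(f_new, v_new, f_ref, v_ref):
    """Estimated power at the new point, as a fraction of the reference."""
    return (f_new / f_ref) * (v_new / v_ref) ** 2

# 1920MHz @ 0.912V vs a hypothetical 2040MHz @ 1.050V stock point:
ratio = relative_power(1920, 0.912, 2040, 1.050)
print(f"~{ratio:.0%} of stock dynamic power")
```

Roughly a quarter of the dynamic power disappears for about a 6% clock loss, which is why a higher-PL BIOS pairs so well with an undervolt: the saved watts buy stable clocks instead of throttling.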


----------



## Gunnutzz467

BluePaint said:


> 450 Strix OC, FTW just above 410 and 360 with original MSI Gaming BIOS in 3dmark benches.
> 
> It seems to me that the Strix OC has a better voltage curve which is easier to get stable than the FTW. With FTW bios i wasn't really able to beat my 3dmark stock scores, despite +50W PT. Strix was much easier to get stable frequencies > 2100Mhz.


You worried about longevity of the card with strix bios on the trio and pulling ~100w over stock?


----------



## BluePaint

Gunnutzz467 said:


> You worried about longevity of the card with strix bios on the trio and pulling ~100w over stock?


Not really. I probably won't have enough time to degrade it. Also, I'll be using it undervolted for many games where I want it silent. It's just nice to know that the power is there when I need it.


----------



## Talon2016

BluePaint said:


> Not really. I probably won't have enough time to degrade it. Also, will be using it undervolted for many games where I want it silent. It's just nice to know that the power is there when I need it


This. The main benefit of the much higher power limit is the ability to undervolt while overclocking and not hitting that power ceiling. That keeps clocks nice and stable while keeping power draw and temps in check.


----------



## locc

Chrisch said:


> Anyone else with TUF OC having the same problem hitting max 355W with PT @ 110%?


Yes, I have the same problem. Max power is 105% (355W) instead of the 110% (375W) that reviewers had. The slider goes to 110% but in reality it only gives 105%.
The other problem is that the fans run at 53% even when the GPU is under 30°C with no load at all. The fans only stop when the monitor goes to sleep. I've tested the power limit 'limitation' and the fan problem multiple times with different software and drivers.


----------



## KingEngineRevUp




----------



## Vaesauce

Sup dudes,

Got my Aorus 3080 Master yesterday.
Just now catching up on this entire thread lol. I would help with uploading the Master's vBIOS but GPU-Z can't pull it. I get the message "BIOS reading not supported on this device", which I assume means it's baked into the GPU? Or perhaps I need to use a different version?

Regardless, I haven't really run any Time Spy benchmarks because it includes the CPU and I'm on a stock 3700X, but I do remember my GPU score being in the 19000-19100 range. That's with it overclocked by +130 core clock and +1200 memory clock with the fan at 100%.

This was my Port Royal:


https://www.3dmark.com/3dm/51360833



+130 core, +1200 memory at 100% fan speed. If I use my custom fan curve so that it doesn't run at 100% lol, the temps sit around 57-61°C.

I don't overclock much, or at all, so I'm not well educated on this type of stuff. I'll try to answer questions if there are any about the Aorus 3080 Master....


P.S. Card is huge as hell, and heavy. Sags and doesn't come with an anti-sag bracket lol. LCD screen displays very crisp GIFs.


----------



## ssgwright

can someone post up the flashing tools?


----------



## KingEngineRevUp

Talon2016 said:


> https://www.3dmark.com/3dm/51349539
> 
> 
> -- 19,680 GPU in Time Spy.
> 
> Asus Strix OC vBIOS just made the FTW3 3080 the card to get IMO. With the Asus vBIOS it is basically shunt modding itself and hugely under reporting power draw and the card holds crazy high boost now. I just managed to score the #1 spot in the US with this vBIOS on my FTW3 Ultra. Reported power draw under max overclock and max fans was around 330w. It's under reporting but the card performance is still scaling and the clocks are boosting to over 2100Mhz and holding near 2100Mhz the entire TimeSpy run. Not quite sure of how or why it's doing this yet, but its working and working well .
> 
> Fans run at same 3000rpm max so no issues there. One DP was deactivated, but haven't tried HDMI. Works for me as I use 2 DP ports for my monitors and they work.
> 
> Asus Strix OC 3080 vBIOS shared by a nice new owner over at reddit. They did us a solid!
> 
> *Asus Strix OC 3080 vBIOS*
> 
> 
> 
> 
> 
> 
> 
> 
> File on MEGA
> 
> 
> 
> 
> 
> 
> 
> mega.nz


WOW, how are your temperatures at 60°C? What is your ambient temperature? In an 80°F room, my fans at 100% can only keep the card at 64°C.

EDIT: NVM, Afterburner wasn't putting the 3rd fan at 100%... After flashing, does Afterburner control the fans better? I don't like Precision X1.


----------



## Chrisch

locc said:


> Yes, I have the same problem. Max power is 105%, 355W, instead of 110% and 375W what reviewer had. Slider goes to 110% but in reality it only gives 105%.
> Other problem is that fans run at 53% even when gpu is under 30C and there is no load at all. Fans only stop when monitor goes to sleep. I've tested the power limit 'limitation' and fan problem multiple times with different software and drivers.


Yeah, I see that with the fans. After testing several things I noticed the fans will only stop if you don't have any program/tool open. With only GPU Tweak open there's no problem, but a browser, Afterburner, or GPU-Z will trigger the fans.


----------



## Mucho

VPII said:


> It does not matter. I have tried various bioses on this card, most of them 2 x 8pin cards with a 366, 370 and 375watt power limit. With all of these the card still act like 320watt is the power limit.


Did you try the Inno3D X3 BIOS? It has a PL of 340W. Maybe flashing this BIOS will let you hit 340W?


----------



## VPII

Mucho said:


> Did you try the Inno3D X3 BIOS? It has a PL of 340W. Maybe flashing this BIOS will let you hit 340W?


Have not seen the bios so cannot say.


----------



## MrBridgeSix

Vapochilled said:


> Gigabyte Eagle OC with the Gaming OC BIOS and a custom voltage curve: I see 360W max (that's a 20W improvement over the default Eagle OC).
> 
> 3DMark Fire Strike 4K:
> Graphics score: 11 537
> Graphics test 1: 63.67 FPS
> Graphics test 2: 41.39 FPS
> 
> 9500 score for Time Spy 4K.
> 
> No point in posting combined results... I have a 6700K lol.
> I did undervolt, and I see 350W usage at 1920MHz with 0.912V in the heaviest scenes of Time Spy 4K.
> 
> My guess is that flashing a higher-TDP BIOS is only useful if you do custom curves. I came up 400 points.


Same custom curve, different BIOS but higher results with the Gaming OC BIOS?

I also have a RTX 3080 Eagle should I flash the Gaming OC BIOS or not?


----------



## Vapochilled

MrBridgeSix said:


> Same custom curve, different BIOS but higher results with the Gaming OC BIOS?
> 
> I also have a RTX 3080 Eagle should I flash the Gaming OC BIOS or not?


Hey man,

I did it without problems. I got better scores in 3DMark because I can clock higher when hitting power limits.
As I said, during Time Spy 4K I see 350W at 1920MHz in some situations.


----------



## VoRtAn

Well, both BIOSes compared:
*Asus +100W vs MSI +250 fan RPM*, same card, different BIOS, daily PC configuration













Next... FurMark, to see the power from each power connector at 99% usage.


----------



## ssgwright

anyone flash a 3 power connector card bios to a 2 power connector card?


----------



## ssgwright

also anyone got an asus strix bios?


----------



## dvfedele

ssgwright said:


> also anyone got an asus strix bios?


Asus Strix OC 3080 vBIOS
*976 KB file on MEGA*


----------



## Nizzen

ssgwright said:


> anyone flash a 3 power connector card bios to a 2 power connector card?


It works, but it doesn't help. The power limit will be lower, since the total is divided across three connectors.


----------



## shallow_

Swedish Strix 3080 video showing card taking off the Geforce logo bracket at 3 min 49 sec


----------



## ssgwright

ah... that sucks


----------



## KenjiS

https://www.3dmark.com/spy/14410414



This is the best I've managed so far with my eVGA FTW3

Anyone have any advice/comment on it? Im not sure if im doing well or not, I feel like the card has more to give I'm just not capable of getting it


----------



## Taggen86

I am a newbie to BIOS flashing. What do I need to flash my 3080 Trio to the Strix OC BIOS? The Strix BIOS ROM and *NVIDIA NVFlash 5.660.0?* Really tired of seeing the card hitting the power limit all the time while gaming, especially given I have an 850W PSU and my temps are around 65°C.


----------



## VoRtAn

Taggen86 said:


> I am a newbie to bios flashing. What do I need to bios flash my 3080 trio to the strix oc bios? The strix bios rom and *NVIDIA NVFlash 5.660.0? *


Nvflash64 -6 nameofbios.rom

After the flash you will lose 250rpm at 100%: the Asus BIOS tops out at 3000rpm, the MSI BIOS at 3250rpm.
You'll also lose RGB control in Mystic Light (I'm assuming so, because that happened with the FTW BIOS; I didn't test it).
You gain TDP as expected. Just remember that at the same fan percentage you'll have fewer RPM; for example, for 1950rpm the stock BIOS needs 60%, the Asus 75%.

For daily use on air, BIOS flashing is nonsense; for benching it's another thing.

Perfect would be the MSI BIOS with the Asus TDP, to keep that extra 250rpm. Dunno, maybe in the future...
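The "same percentage, fewer RPM" point can be sketched with a simple linear %-to-RPM mapping. Real fan curves aren't linear (which is why VoRtAn's observed 75% differs from the linear prediction), so treat this purely as an illustration:

```python
# Sketch of VoRtAn's point: the same fan-speed percentage means fewer
# RPM on the Asus BIOS because its 100% point is lower (3000 vs 3250).
# A linear %->RPM mapping is assumed; real fan curves are not linear.
MSI_MAX_RPM = 3250
ASUS_MAX_RPM = 3000

def rpm_at(percent, max_rpm):
    return percent / 100 * max_rpm

def percent_for(rpm, max_rpm):
    return rpm / max_rpm * 100

print(rpm_at(60, MSI_MAX_RPM))                 # 1950.0 RPM on the MSI BIOS
print(round(percent_for(1950, ASUS_MAX_RPM)))  # 65 -- a higher % for the same RPM
```

So after flashing, any saved fan curve expressed in percent spins the fans slower than before; the curve has to be re-tuned against actual RPM, not percentage.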


----------



## ssgwright

Soooo, I went and ordered two 3080 (non-FE) waterblocks, one from EK and the other from aquatune. Depending on which one arrives first, I'll have an extra brand-new block I'd be willing to sell here for $120 shipped. I'll post in the sale section once I get them; figured I'd give a heads up here.


----------



## ssgwright

Taggen86 said:


> I am a newbie to BIOS flashing. What do I need to flash my 3080 Trio to the Strix OC BIOS? The Strix BIOS ROM and *NVIDIA NVFlash 5.660.0?* Really tired of seeing the card hitting the power limit all the time while gaming, especially given I have an 850W PSU and my temps are around 65°C.


on the first page of this thread there's detailed instructions


----------



## freejak13

KenjiS said:


> https://www.3dmark.com/spy/14410414
> 
> 
> 
> This is the best I've managed so far with my eVGA FTW3
> 
> Anyone have any advice/comment on it? Im not sure if im doing well or not, I feel like the card has more to give I'm just not capable of getting it


Mind sharing your settings? I have an FTW3 Ultra flashed with the Strix BIOS and I'm unable to crack 19k in Time Spy. Perhaps I just lost the silicon lottery. Thanks


----------



## KenjiS

freejak13 said:


> Mind sharing your settings? I have an FTW3 Ultra flashed with the Strix BIOS and I'm unable to crack 19k in Time Spy. Perhaps I just lost the silicon lottery. Thanks


I'm using X1 for it and the stock BIOS. For this run I did +100 on the voltage, +125 on the core, and +250 on the memory, with the power limit and temp limit sliders pushed as far as they can go. For the run I also put all 3 fans at 100%, and I killed every app on my system besides Steam (because my 3DMark is the Steam version).

You can try using the OC Scanner thing in X1; I've not had fantastic results from it, YMMV.

When I tried to go any higher on mem or core I got score regression, which made me dial it back down again. I feel there's still more in there, it's just a matter of tinkering a bit.


----------



## freejak13

KenjiS said:


> I'm using X1 for it and the stock BIOS, For this run I did +100 on the voltage, +125 on the Core and +250 on memory, Power limit and Temp Limit sliders pushed as far as they can go, for the run I also just put all 3 fans to 100% I also killed every app on my system besides Steam (because my 3dmark is Steam)
> 
> You can try using the OC Scanner thing in X1, I've not had fantastic results from it YMMV
> 
> When I tried to go any higher on mem or core I got score regression which made me dial it back down again, I feel theres still more in there its just a matter of tinkering a bit.


Thanks. Just tried your settings and crashed near the end. I'm beginning to think it might be my mobo or CPU. Every GPU I've put into this system has overclocked terribly over the years.

Edit: dropping to +120 got me through. Barely over 19k though.



https://www.3dmark.com/3dm/51399133?


----------



## Mucho

VPII said:


> Have not seen the bios so cannot say.


Here you go:






File-Upload.net - file not found


(Unfortunately, the uploaded file could not be found. Please upload the file again.)



www.file-upload.net


----------



## KenjiS

freejak13 said:


> Thanks. Just tried your settings and crashed near the end. I'm beginning to think it might be my mobo or cpu. Every gpu I've put into this system overclocked terribly over the years.
> 
> Edit: dropping to +120 got me through. Barely over 19k though.
> 
> 
> 
> https://www.3dmark.com/3dm/51399133?



I'm still trying to hit 19k. I'm pretty sure there's something I'm missing.. I've actually barely gotten to play anything on my rig, I've spent all my time trying to get my 3DMark score higher.

I might try lowering the voltage to 80 or something; I feel it's all about a few degrees, or holding the boost just a little longer, right now.


----------



## finalheaven

What's the average Time Spy graphics score without any overclocking? I have the FE and I score approximately between 17900-18100. Is that normal?


----------



## ssgwright

That's good; with my still-on-air stable overclock I scored 18,589.


----------



## gerardfraser

Vaesauce said:


> Sup dudes,
> Got my Aorus 3080 Master yesterday.
> Just now catching up on this entire thread lol. I would help with uploading the Master's vbios but GPU-Z can't pull it. I get the message " BIOS reading not supported on this device ", which I assume it's baked into the GPU? Or perhaps I need to use a different version?


Looks like interesting BIOS that some people would love to try. Quick guide if you want to share.

Flash guide: to save your BIOS for upload and to flash a BIOS
1. Download nvflash (NVIDIA NVFlash (5.660.0) Download) and unpack the zip to a folder, e.g. a desktop folder named NVFlash_3080.
2. Go to Device Manager and disable the video card (I saved my BIOS and flashed without doing this, but your choice).
3. Run Command Prompt as an administrator (important).
4. Paste this into the command prompt, changing yournamehere to your computer name: CD C:\Users\yournamehere\OneDrive\Desktop\nvflash_3080 (or copy the folder path of NVFlash_3080 from the desktop).
5. To save your BIOS, paste this into the command prompt: nvflash64 --save biosname.rom
6. To flash one card, paste this into the command prompt: nvflash64 -6 biosname.rom
7. Follow the on-screen prompts.
8. Re-enable the video card, or reinstall the drivers, which would be best.

If you have more than one card and want to flash a specific one:
nvflash --list
Look at the index number (0, 1, 2, 3) and then
nvflash -6 --index <#> BIOSNAME.rom


----------



## Taggen86

VoRtAn said:


> Nvflash64 -6 nameofbios.rom
> 
> After the flash you will loose 250rpm max at 100%, asus bios tops at 3000rpm, msi bios 3250.
> You'll loose algo rgb control on mystic light (i'm assuming as granted because that happened with ftw bios, didnt test it).
> You gain TDP as expected, just remember, at same fan percentage, you'll have less rpm, for example 1950rpm, stock bios 60%, asus is 75%.
> 
> Daily on air, bios flashing is nonsense, for benching is another thing.
> 
> Perfect would have msi bios with asus tdp to have that 250rpm extra, dunno in the future...


So you don't recommend flashing the Trio for daily gaming on air? I game ca. 2 hours per day and the card is always at 99% utilization when gaming. Maybe it is better to flash it with a BIOS that only increases the power limit by 30 or 50W max in my case? (e.g. the EVGA one)


----------



## KenjiS

finalheaven said:


> Whats the average Time Spy graphics score without any overclocking? I have the FE and I score approximately between 17900-18100. Is that normal?


I know OOB, without doing anything, when I did a simple baseline it was something like 16k in my case, but that might be a bit artificially low.


----------



## Talon2016

KenjiS said:


> I'm using X1 for it and the stock BIOS, For this run I did +100 on the voltage, +125 on the Core and +250 on memory, Power limit and Temp Limit sliders pushed as far as they can go, for the run I also just put all 3 fans to 100% I also killed every app on my system besides Steam (because my 3dmark is Steam)
> 
> You can try using the OC Scanner thing in X1, I've not had fantastic results from it YMMV
> 
> When I tried to go any higher on mem or core I got score regression which made me dial it back down again, I feel theres still more in there its just a matter of tinkering a bit.


Try this. Go into X1 and set the power limit and temp limit to max. Set the fans to max. Leave the voltage slider alone at stock.

Close (not minimize) X1 and open MSI Afterburner.

Set the fans to 100% in Afterburner, and set the power limit and temp slider to max. Set your +125MHz on the core, set +1100MHz on the memory, and run again. Your +250MHz on the memory is too low.


----------



## VoRtAn

Taggen86 said:


> So you dont recommend flashing the trio for daily gaming on air? I game ca 2 hours per day and the card is at 99% utilization always when gaming. Maybe it is better to flash it with a bios that only increases the power limit by max 30 or 50w in my case? (e.g. the evga one)


Hi.
_Maybe my earlier words caused a misunderstanding, my bad._
What I meant is that the benefits for gaming are minimal, and raising the power target for daily use is more noticeable if your card gets low temps. So, assuming nobody plays at 100% fan speed, the higher-TDP BIOS flash makes more sense for people on watercooling.

*For example, the average speed on my Time Spy run with the MSI 3080 on the Strix BIOS is 2092MHz (the average is what counts daily), and I was using 100% fan speed.
Daily, on the stock BIOS with less power consumption, I can play everything above 2GHz, on average +/- 2040MHz considering heavier/lighter games. Those runs were not the max OC out of the card; I run clocks that I know are 100% safe and stable all around.*

I totally understand that 3DMark is heavier, and with the Asus BIOS I will get even higher average clocks gaming, but the gains vs watt consumption, in my opinion, for 1-2% extra performance are not a reason to flash the BIOS. For benching, definitely. And if you flash the Asus BIOS on the MSI, you'll only use the extra TDP if you want to, of course, and you'll have full power available a click away.

It all also depends on what type of end user we are talking about.

I'm pushing MSI support, showing results with other BIOSes (EVGA and now Asus), to see if I can get a BIOS release with extra TDP that keeps the fan curve perfect and the RGB fully functional (honestly I don't use it haha).

One thing I haven't tested yet is undervolting with the Asus BIOS, using the extra TDP to help; daily undervolting is a must with the 30xx series. I don't remember the exact values because I didn't spend much time on it, but from what I remember, out of the box in quick tests I could run gaming 100% above 2GHz at around 0.950V and around 1920MHz at around 0.830V, something like that.

Cheers m8.


----------



## Vapochilled

I think it's better to build a table to make things clearer: the 1080p, 1440p, 4K and 5K speeds as well as TDP. Your max and average clocks will be lower in the 4K and 5K tests. I don't play below 5K, so 1080p tests are worthless to me.

| Default BIOS | Current BIOS | Custom curve | 1080p | 1440p | 4K |
|---|---|---|---|---|---|
| Gigabyte Eagle OC, 340W | Gigabyte Gaming OC, 370W | Yes | etc | blah | Time Spy: 9504, Fire Strike: 11530, max TDP seen: 360W, lowest clock: 1905, avg clock: -, max peak: - |


----------



## cstkl1

Will upload the Strix 3080 BIOS on Sunday, if nobody has uploaded it yet. Collecting the card tomorrow but heading out of state for a wedding.

3090 Strix... price rocketed to 1800..


----------



## Arni90

KenjiS said:


> I'm still trying to hit 19k. I'm pretty sure theres something I'm missing.. I've actually barely gotten to play anything on my rig ive spent all my time trying to get my 3dmark score higher
> 
> I might try lowering the voltage to 80 or something, i feel its all about a few degrees or holding the boost just a little longer right now


Use a custom voltage/frequency curve: I recommend using only the 900 and 925 mV points for Time Spy. Use the slider to pull down every point, then select the 900mV point and increase it to 1995 MHz or so, as well as the 925 mV point to 2010 or 2025 MHz, then hit apply. You should now have a flat V/F-curve.

Using this principle on my watercooled 3080, I was able to achieve 19K GPU Score: https://www.3dmark.com/spy/14322558

Also, memory timings are incredibly important for the Time Spy CPU test

Oh, and for those who are wondering what a shunt modded RTX 3080 can do: https://www.3dmark.com/spy/14407664
Though I think I'm approaching CPU limitations
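Arni90's flat-curve trick can be sketched as a small curve transform. The dict-based representation below is an assumption for illustration; the point values (900mV/1995MHz, 925mV/2010MHz) come from the post:

```python
# Sketch of the flat V/F curve: pull every point down to a floor, then
# raise only the 900mV and 925mV points. Afterburner keeps the curve
# non-decreasing, so everything above 925mV flattens to the 925mV clock
# and the card never requests more than ~925mV under load.
def flatten_curve(points_mv, floor_mhz, targets):
    """targets maps a voltage point (mV) to its desired clock (MHz)."""
    curve, prev = {}, floor_mhz
    for mv in sorted(points_mv):
        mhz = max(targets.get(mv, floor_mhz), prev)  # enforce monotonic curve
        curve[mv] = mhz
        prev = mhz
    return curve

points = list(range(700, 1101, 25))  # hypothetical 700-1100mV curve points
curve = flatten_curve(points, floor_mhz=1500, targets={900: 1995, 925: 2010})
print(curve[900], curve[925])  # 1995 2010 -- the raised points
print(curve[1000])             # 2010 -- flat above 925mV
```

Capping the voltage this way is what keeps the card inside its power limit at high clocks: the same 2010MHz at stock voltage would cost far more watts.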


----------



## rankftw

I've just ordered a Gigabyte 3080 Eagle OC. What BIOS are compatible with this card and which has the highest power limit?


----------



## KHUNGOLF

Will flashing the ASUS TUF non-OC to the "OC BIOS" increase the power limit?


----------



## BluePaint

Arni90 said:


> Oh, and for those who are wondering what a shunt modded RTX 3080 can do: https://www.3dmark.com/spy/14407664
> Though I think I'm approaching CPU limitations


How much power does it draw with the shunt mod?
With the 450W max from the Strix OC BIOS + a 3900X I got a similar GPU score: https://www.3dmark.com/3dm/51369600


----------



## asdkj1740

3080 / 3090 / 3070 Gigabyte Eagle Gaming OC & Vision Power Connector Concerns


***UPDATE 1.5 IMPORTANT INFO To clarify for everyone and any one new here the cards affected are as follows re serial number WK39 onwards will have the revised new connector block *UPDATE however some cards may be mixed and still could be on the old connector block even after WK39 WK38...




www.overclockers.co.uk




to gigabyte users, go ask for the new adaptor in case the original one failed.


----------



## freejak13

KHUNGOLF said:


> Will ASUS TUF non-OC flash to "OC Bios" increase Power Limit ?


non-OC and OC have the same power limits. You're mainly looking at a boost clock increase.


----------



## Bennimaru

Arni90 said:


> Use a custom voltage/frequency curve: I recommend using only the 900 and 925 mV points for Time Spy. Use the slider to pull down every point, then select the 900mV point and increase it to 1995 MHz or so, as well as the 925 mV point to 2010 or 2025 MHz, then hit apply. You should now have a flat V/F-curve.
> 
> Using this principle on my watercooled 3080, I was able to achieve 19K GPU Score: https://www.3dmark.com/spy/14322558
> 
> Also, memory timings are incredibly important for the Time Spy CPU test
> 
> Oh, and for those who are wondering what a shunt modded RTX 3080 can do: https://www.3dmark.com/spy/14407664
> Though I think I'm approaching CPU limitations



Thanks for sharing with us. With your settings I got my best score on my Asus TUF, with the lowest temp and power draw I possibly could.



https://www.3dmark.com/3dm/51423410?



Thanks again


----------



## freejak13

asdkj1740 said:


> 3080 / 3090 / 3070 Gigabyte Eagle Gaming OC & Vision Power Connector Concerns
> 
> 
> ***UPDATE 1.5 IMPORTANT INFO To clarify for everyone and any one new here the cards affected are as follows re serial number WK39 onwards will have the revised new connector block *UPDATE however some cards may be mixed and still could be on the old connector block even after WK39 WK38...
> 
> 
> 
> 
> www.overclockers.co.uk
> 
> 
> 
> 
> to gigabyte users, go ask for the new adaptor in case the original one failed.


I had this issue with the gigabyte gaming oc. Returned it and fortunately managed to get a ftw3 ultra as a replacement.


----------



## darkangelism

If I want to replace the OC BIOS, do I have to boot the card in that mode to flash, or can I flash either the OC or normal BIOS with nvflash?


----------



## JackCY

Pricing, when it's even shown on an e-shop:
"MSRP" 825 USD, what it "should" be
pre-order 1175 USD, the actual price to get stuck on a waiting list for months

Seriously, this situation is not far off the communist 60s-80s: want to buy something, get stuck on an endless waiting list. The difference is, this time you're getting ripped off to the moon.

Haven't seen a single 3000-series card for sale since launch, never any anywhere, if even listed at all. The supply is pathetic as always.


----------



## freejak13

Bennimaru said:


> Thanks for sharing with us. With your settings i got my best score on my Asus TUF with the lowest temp and power draw i possibly could.
> 
> 
> 
> https://www.3dmark.com/3dm/51423410?
> 
> 
> 
> Thanks again


That's a great score for the tuf. Is that stock bios or something else?


----------



## BluePaint

That TUF score looks great for the average frequency of 2014MHz. I needed 100MHz more for just 200 points on top. Maybe the new driver has some optimizations. Wouldn't be surprising so early in the generation. Have to check that later.


----------



## Vapochilled

Could you guys share the Spy extreme results?


----------



## BluePaint

https://www.3dmark.com/spy/14244983 TSE 9890 GPU score with the original MSI BIOS, PL 350W.
Haven't really tried with the Strix BIOS yet.


----------



## Bennimaru

freejak13 said:


> That's a great score for the tuf. Is that stock bios or something else?


It's the stock BIOS. I tried the OC one but saw no improvements at all, so I went back to the stock one, the Performance BIOS of course.


----------



## Bennimaru

Vapochilled said:


> Could you guys share the Spy extreme results?


Here's mine



https://www.3dmark.com/spy/14426740


----------



## finalheaven

KenjiS said:


> I know OOB without doing anything when I did a simple baseline it was something like 16k in my case but that might be a bit artifically low


Are you talking about the combined score or only the graphics score? I was only talking about the graphics score.

I am at +0/+0 (no overclocking) with the FE 3080, but I increased both the power and temp limit sliders to 115/90.



https://www.3dmark.com/3dm/51426750


----------



## freejak13

PSA for those flashing the Strix OC BIOS on your FTW3 Ultra: you'll lose the ability to control RGB, as was already reported, but you'll also lose one of the DisplayPorts. Only two of mine work. Flashing back to the factory BIOS restores all functionality.


----------



## asdkj1740

freejak13 said:


> PSA for those flashing the strix oc bios on your ftw3 ultra: you'll lose the ability to control rgb as was already reported but you'll also lose one of the displayports. Only two of mine works. Flashing back to the factory bios restores all functionality.


how about those icx temp sensors on pcb? are they still working?


----------



## Talon2016

asdkj1740 said:


> how about those icx temp sensors on pcb? are they still working?


ICX temps and sensors all work. Oddly enough my RGB was working fine with Strix vBIOS on my FTW3 Ultra. Maybe I need to go back and test this again.


----------



## Chrisch

Is anyone here with an MSI Gaming X Trio running a STRIX BIOS? If yes, does it work?


----------



## arrow0309

Chrisch said:


> is here anyone with a MSI Gaming X Trio and maybe a STRIX BIOS? if yes, does it work?


Just take a look at the last 5 pages; they talk about little else.


----------



## Vaesauce

gerardfraser said:


> Looks like interesting BIOS that some people would love to try. Quick guide if you want to share.


Thank you bud!

For anyone who is curious,

Here is the Aorus 3080 Master BIOS... obviously, use at your own discretion. This card is a TWO 8-PIN card.

Out of the box, 1845 MHz boost clock.

From my own card (fat heatsink, lol), in Port Royal it boosts past 2000 MHz (with no OC), but that is also because of this card's cooling solution.

File-Upload.net - Aorus3080Master.zip


----------



## owntecx

Hi guys, got my Asus TUF OC 3080 today. Long story short: after two driver reinstalls my idle power is 100W and the clock is stuck at a minimum of 1785 MHz. Any suggestion to fix this?


----------



## vigorito

Any owners of the MSI Ventus model, how satisfied are you overall? Is 2100 MHz possible?


----------



## Riadon

Vaesauce said:


> Thank you bud!
> 
> For anyone who is curious,
> 
> Here is the Aorus 3080 Master bios... obviously, use at your own discretion. This card is a TWO 8-PIN card.
> 
> Out of the box, 1845 MHz clock.
> 
> From my own card (fat heatsink lul), on Port Royal, it does overboost past 2000mhz (with no OC) but that is also because of the cooling solution this card has.
> 
> File-Upload.net - Aorus3080Master.zip


What's the power limit on that card?


----------



## Vapochilled

Vaesauce said:


> Thank you bud!
> 
> For anyone who is curious,
> 
> Here is the Aorus 3080 Master bios... obviously, use at your own discretion. This card is a TWO 8-PIN card.
> 
> Out of the box, 1845 MHz clock.
> 
> From my own card (fat heatsink lul), on Port Royal, it does overboost past 2000mhz (with no OC) but that is also because of the cooling solution this card has.
> 
> File-Upload.net - Aorus3080Master.zip



Wow!!! Thanks!!! A 2x 8-pin card??? I had in mind that the Aorus flagship was like the Asus Strix with 3x 8-pins all the way....

So, what's the TDP of the Aorus?
I'm planning to flash it now on my Eagle OC.


----------



## Jedzy

Riadon said:


> What's the power limit on that card?


Just checked and it seems to be 370W, although the first page says it goes up to 380W. It's got dual BIOS, so I'm guessing one is normal and one is the OC with the 380W limit.


----------



## ssgwright

Can someone please post up the ASUS TUF OC bios? I have the non OC card and want to see how it does with the OC bios.


----------



## spajdr

*Vaesauce* can you please also provide the second BIOS, if it's any different in power limit?

*Vapochilled* Let me know how that AORUS BIOS worked for you, cheers


----------



## Purple_Light

ssgwright said:


> Can someone please post up the ASUS TUF OC bios? I have the non OC card and want to see how it does with the OC bios.


This has been posted twice since page 30; you'll have to dig a little bit.


----------



## Jedzy

ssgwright said:


> Can someone please post up the ASUS TUF OC bios? I have the non OC card and want to see how it does with the OC bios.











Hi everyone, this is my Asus RTX 3080 TUF OC performance BIOS, enjoy: https://fil.email/TlW9iNRD


----------



## Vaesauce

Vapochilled said:


> Wow!!! Thanks!!! 2x pin card??? I had in n mind aorus flag was like the Strix in asus with 3x pins all the way....
> 
> So, what's the tdp of aorus?
> I'm planing to flash it now on my eagle oc


The "Master" is a two 8-pin card, but the "Xtreme", which is coming out in a week or two, is a three 8-pin.

That said, the TDP is 370/380W, I believe.

For anyone curious, the power limit slider only goes to 100%, BUT... not all power limits are created equal, lol. The 100% power limit on the Aorus BIOS is significantly higher than everything else.

@spajdr the second BIOS is the Silent BIOS. I haven't tried running it, but I'm going to assume the fan profile, clocks and everything are slower and tuned more for "silence", haha. I'll upload the Silent BIOS later if anyone really wants it.

Let me know how the BIOS works on everyone's two 8-pin cards


P.S. The TDP on this card naturally has to be high because of the heatsink and fan profile, along with the LCD screen. So hopefully it gives everyone extra power headroom.


----------



## Vapochilled

Vaesauce said:


> Thank you bud!
> 
> For anyone who is curious,
> 
> Here is the Aorus 3080 Master bios... obviously, use at your own discretion. This card is a TWO 8-PIN card.
> 
> Out of the box, 1845 MHz clock.
> 
> From my own card (fat heatsink lul), on Port Royal, it does overboost past 2000mhz (with no OC) but that is also because of the cooling solution this card has.
> 
> File-Upload.net - Aorus3080Master.zip



If your card is dual BIOS, which BIOS did you upload here?


----------



## Vaesauce

Vapochilled said:


> If you are dual bios what BIOS are you uploading here?


The one I uploaded is the Default/OC one. I promise you, it's not the Silent bios


----------



## Nizzen

I collected some BIOSes:

NVIDIA GeForce RTX 30xx BIOSes for flashing

How to back up your GPU BIOS (open CMD as administrator): nvflash64 --protectoff, then nvflash64 --save 30xxModel.rom. How to flash a BIOS: nvflash64 --protectoff, then nvflash64 -6 biosname.rom. You can find it here: FLASH | GUIDE. Nvflash for 30xx graphics cards: nvflash64.rar. Nvflash for FE cards: This...



www.diskusjon.no


----------



## Vapochilled

spajdr said:


> *Vaesauce *can you please also provide second BIOS? if it's any different in power limit
> 
> *Vapochilled *Let me know how that AORUS bios worked for you  cheers



After hitting yes on the flash procedure, the screen went black... nothing else.

I did a reboot... nothing. But I could see the LED activity, so Windows was working.

I had the flash-back procedure in my head; I'd recorded all the keyboard steps hahaha in case the screen went black. So I flashed back the Eagle and Gaming OC BIOSes; they both work. The Aorus didn't...

Someone else want to try?


----------



## freejak13

Vapochilled said:


> **** I'm doomed. Card brick.... After yes on the flash procedure... Screen went black.... Nothing else until now....
> 
> I did a reboot... And I see the system is login into windows..
> 
> I did windows key + open as admin... (. Doing without seeing... I recorded how to do it in case o brick)..
> Nothing... All black..I think I will need to take the card to a friend's place...


Use your onboard video to access windows and reflash from there.


----------



## Vapochilled

freejak13 said:


> Use your onboard video to access windows and reflash from there.


Back to life after flashing back my bios.
With aorus bios, image is always black. I had to flash back to gaming oc


----------



## doom26464

Snagged a Gigabyte Aorus Master on a drop; stock was gone in like 7 minutes tops. Should be here mid next week due to the Canadian holiday weekend.

I know it's only a dual 8-pin card, but does anyone know the power limit on them? It does have dual BIOS, so I may play with flashing; we'll see, it's been a while since I've done GPU flashing.

Reviews are hard to find.


----------



## Vapochilled

Vaesauce said:


> The one I uploaded is the Default/OC one. I promise you, it's not the Silent bios


Conclusion: I have a Gigabyte Eagle OC; it works with the Gaming OC BIOS, but not with that Aorus BIOS. :(

Anyone else want to give it a try?


----------



## gerardfraser

Vapochilled said:


> Conclusion: I have a gigabyte eagle oc, it works with gaming oc bios, but not with that aorus bios. ((
> 
> Anyone else wants to give it a try?


@Vaesauce 
Thank you for uploading the BIOS, awesome!

@Vapochilled 
Well, I tried the Gigabyte Aorus Master 380W BIOS for about an hour on the Asus TUF non-OC version.

Gigabyte Aorus Master
Default clocks in benchmarking and a couple of games:
Core clock - up to 2070 MHz
Memory - 9500 MHz

+50 OC:
Core clock - up to 2115 MHz - crashes in benchmarks
Memory - 10100 MHz

+40 OC:
Core clock - up to 2100 MHz - OK in benchmarks, but crashes in the NVIDIA performance tuner
Memory - 10100 MHz

Overall the BIOS is fine and works the same as any other BIOS I've tried. I'm back on the Asus TUF OC BIOS on my TUF non-OC card. I'll eventually try every BIOS, but they'll all be much the same: some will crash and some will run fine, but the best BIOS will be from the same brand as the card.


----------



## KenjiS

Talon2016 said:


> Try this. Go into X1 and set power limit and temp limit to max. Set fans to max. Leave voltage slider alone at stock.
> 
> Close (not minimize) X1 and open MSI Afterburner.
> 
> Set fans to 100% in Afterburner, set power limit and temp slider to max. Set your +125mhz on the core and set +1100Mhz on memory and run again. Your +250mhz on the memory is too low.


I was wondering if it was the program, to be honest. I'm a lot less familiar with X1, and sometimes it seemed to not make sense; I also had to hit reset-to-default and re-enter everything a few times to get it to stick. I just updated to the new version released yesterday, however, and was planning to toy with it some more this weekend.



Arni90 said:


> Use a custom voltage/frequency curve: I recommend using only the 900 and 925 mV points for Time Spy. Use the slider to pull down every point, then select the 900mV point and increase it to 1995 MHz or so, as well as the 925 mV point to 2010 or 2025 MHz, then hit apply. You should now have a flat V/F-curve.
> 
> Using this principle on my watercooled 3080, I was able to achieve 19K GPU Score: https://www.3dmark.com/spy/14322558
> 
> Also, memory timings are incredibly important for the Time Spy CPU test
> 
> Oh, and for those who are wondering what a shunt modded RTX 3080 can do: https://www.3dmark.com/spy/14407664
> Though I think I'm approaching CPU limitations


I don't dare touch my memory timings again; I JUST finally got my 3600 CL16 stable after a year (partially my fault, I misread my BIOS, my bad).

I'll play with this over the weekend too.



finalheaven said:


> Are you talking about the combined score or only the graphics score? I was only talking about the graphics score.
> 
> I am at +0/+0 (no overclocking) with the FE 3080. But I increased both the power and temp curve to 115/90.
> 
> 
> 
> https://www.3dmark.com/3dm/51426750


Just graphics, but I stress it was more or less a test run ("is the new GPU working OK/stable/etc."), not really for the score. IIRC I hadn't killed all my background processes and stuff; it was just me seeing if it worked before I got too deep into things.
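Arni90's flat-curve recipe quoted above (pull every point down, then raise only the 900 mV and 925 mV points) can be modeled as a tiny helper. This is a minimal sketch of the resulting curve shape, not Afterburner's actual API; the stock voltage points and the 1400 MHz floor are hypothetical placeholders, while the 1995/2010 MHz targets come from the quote:

```python
# Model of a "flat" V/F curve as described in the quote above:
# points below 900 mV are pulled down so they are never boosted to,
# 900 mV runs at 1995 MHz, and 925 mV and above are capped at 2010 MHz.
# Stock points and the floor clock are illustrative, not measured values.

def flat_vf_curve(voltages_mv, mid_mv=900, mid_mhz=1995,
                  top_mv=925, top_mhz=2010, floor_mhz=1400):
    """Return a {millivolts: MHz} curve clamped at top_mhz."""
    curve = {}
    for mv in voltages_mv:
        if mv < mid_mv:
            curve[mv] = floor_mhz   # pulled all the way down
        elif mv < top_mv:
            curve[mv] = mid_mhz     # the 900 mV point
        else:
            curve[mv] = top_mhz     # clamp: the GPU never exceeds this
    return curve

stock_points = [850, 875, 900, 925, 950, 975]
for mv, mhz in flat_vf_curve(stock_points).items():
    print(f"{mv} mV -> {mhz} MHz")
```

The clamp at the top is the whole point of the trick: the card cannot boost past the capped clock, so voltage (and therefore power) stays predictable during the benchmark.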


----------



## treetops422

Has anyone done an in-game benchmark with a 3080 on Call of Duty: Modern Warfare? 4K ultra, DX12.


----------



## nievz

Vapochilled said:


> Back to life after flashing back my bios.
> With aorus bios, image is always black. I had to flash back to gaming oc


How did you flash it back to unbrick it? Did you try other ports while on the Aorus BIOS?


----------



## ssgwright

Hey all, the link for the TUF OC BIOS has expired; can someone post a new one?


----------



## VPII

Okay, I've tried the Aorus Master BIOS on my Palit, and funnily enough it was the best-performing of all the non-Palit BIOSes I tested. Unfortunately the overclock results on it were not that great. The funny thing was, when I reverted to my Palit BIOS I wanted to do another 3DMark Time Spy run just to see if my result was still close to my best of 18628; I had to settle for a little lower, but at least it was okay. You'll see the max core clock in the link.



https://www.3dmark.com/spy/14439189



So after this test run I decided to go back to my V/F curve OC, as it was stable in everything, but it seems there was a little (or huge) bump in the curve between the BIOS changes, as this was the result. I'm adding a screen grab as well to show in MSI Afterburner that it is on the curve OC.



https://www.3dmark.com/spy/14439322


----------



## Chrisch

Here is a link with all the 3080 files I have:










AMPEREBIOS.rar beim Filehorst - filehorst.de






filehorst.de


----------



## KenjiS

https://www.3dmark.com/spy/14441850



Hit 18,868 graphics in X1: +130 core, +200 mem, +100 voltage.

I think I need to try MSI Afterburner instead next... the FTW3 is a good OCing board though.

-edit- That said, on the leaderboard I'm ranked 5th among all 3900X/3080 systems and 31st among all 3080-equipped builds.


----------



## Vapochilled

nievz said:


> How did you flash it back to unbrick it? Did you try other ports while on the Aorus BIOS?


To unbrick you can either use the motherboard's HDMI port (which uses the CPU's integrated graphics), or do it like me, entirely from memory under a black screen! hahahhaa
I rebooted the PC, waited 20 secs, entered my Windows password, then Windows key + cmd + right arrow, and down arrow to open cmd as admin. Then I did cd c:\3080 and nvflash64.exe --protectoff
I could hear the Windows sound, meaning the command worked. Whenever you do the protectoff there is a Windows sound and the screen flicks.
After that I did nvflash64.exe -6 gigabyte_gaming_oc.rom
Waited 10 secs, pressed Y.
Waited 10 secs, pressed Y.

Unbrick success, and all done with a black screen hahahahah


----------



## Vapochilled

What I don't understand is...
Why is my Gigabyte Eagle OC able to flash the Gaming OC BIOS (a jump of 20W under load), BUT whenever I flash that Aorus BIOS the screen goes black?
Maybe the Eagle OC can't handle the Aorus Master BIOS...
But I can confirm the Gaming OC works!! And I get a nice performance bump with my custom curve.


----------



## Anthraksi

Vapochilled said:


> What i dont understand is ...
> Why my Gigabyte Eagle OC, is able to flash the Gaming OC - is a jump of 20W under load... BUT... whenever i flash that AORUS.... screen goes black
> Maybe Eagle OC cant handle the Aorus Master BIOS...
> But i can confirm Gaming OC works !! And i get a nice performance bump with my custom curve


Did you try other outputs? Aorus has 3x DP and 3x HDMI, so it is possible that it messes with the 3x DP and 2x HDMI config the Eagle and Gaming OC has and disables a port or two. Might be that you just happened to be plugged into the wrong port.


----------



## Vapochilled

Anthraksi said:


> Did you try other outputs? Aorus has 3x DP and 3x HDMI, so it is possible that it messes with the 3x DP and 2x HDMI config the Eagle and Gaming OC has and disables a port or two. Might be that you just happened to be plugged into the wrong port.


I can re-test, but I did try all the DPs and none were working... hmm...


----------



## Reinhardovich773

Greetings @VPII how are you doing mate? So my Palit 3080 GamingPro OC card is supposed to ship this weekend for me and i would love it if you could share with me the voltage/frequency curve you used to achieve that screenshotted Time Spy score if possible. Thanks in advance!


----------



## VPII

Reinhardovich773 said:


> Greetings @VPII how are you doing mate? So my Palit 3080 GamingPro OC card is supposed to ship this weekend for me and i would love it if you could share with me the voltage/frequency curve you used to achieve that screenshotted Time Spy score if possible. Thanks in advance!


Hi there, I'll share my V/F curve frequencies at each voltage, as I don't touch the voltages. But no worries, I'll assist you as much as I can.


----------



## cstkl1

8 hrs of driving. to god knows what forsaken place

got it


----------



## Reinhardovich773

VPII said:


> Hi there, I'l share with you vcurve frequencies at which voltage as I don't touch the voltages. But no worries will assist you as much as I can.


Ok, looking forward to your esteemed assistance. Thank you so much for your time!


----------



## Nizzen

Vapochilled said:


> After yes on the flash procedure... Screen went black.... Nothing else
> 
> I did a reboot...nothing.. but I could see the LED reads.. so windows is working
> 
> I had in my mind how to flash back.. I recorded all the steps thru keyboard hahaha in case the screen went black... So I flashed back eagle and gaming oc bios.. they both work..aorus didn't...
> 
> Someone else want to try?


Just change DisplayPort. This BIOS does not work with all DPs on my Palit card, so try another port.

Max power draw is 340W on my Palit 3080 OC with the Master BIOS; 350W with the new Palit OC BIOS from palit.com.


----------



## Vapochilled

Reinhardovich773 said:


> Ok, looking forward to your esteemed assistance. Thank you so much for your time!



Actually, it would be nice if we could build an undervolt table on the 1st page.
For example, I can do [email protected] or something like that, but I guess someone wrote 2000 at 0.96?
The lower the voltage we can achieve with stability, the more we can get out of the TDP.
When people here wrote that they saw drops to 1600 and 1700 MHz during Time Spy at 4K, that's because it pulls a lot of TDP at 4K in Time Spy.
Unless you have a custom curve like 0.875 at 1880, then 0.9 at 1900 and [email protected], you will see those drops.
Because under 4K testing in Time Spy, with just [email protected], I see 360W pulled.
For example, when I am in COD at 4K, inside a building (easy rendering), I see [email protected] draw just 240W.

It really depends on the scenario and resolution.
Could we do this table?
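A table like that could start as a plain list of V/F points, plus a rough estimate of how much power each one buys back. This is a minimal sketch using the classic dynamic-power approximation P proportional to f times V squared; every table entry below is a hypothetical example, not a measured result:

```python
# Rough dynamic-power approximation: power scales with frequency times
# voltage squared. Entries below are illustrative, not measured results.

def relative_power(freq_mhz, volts, ref_freq=1900.0, ref_volts=0.90):
    """Estimated power of a V/F point relative to a reference point."""
    return (freq_mhz / ref_freq) * (volts / ref_volts) ** 2

undervolt_table = [  # (MHz, volts) - hypothetical community entries
    (1880, 0.850),
    (1900, 0.900),
    (1950, 0.925),
    (2000, 0.960),
]

for mhz, v in undervolt_table:
    print(f"{mhz} MHz @ {v:.3f} V -> {relative_power(mhz, v):.2f}x reference power")
```

Because voltage enters squared, even a small undervolt at the same clock pulls the estimate down noticeably, which is exactly why a shared table sorted by stable voltage would be useful.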


----------



## Vapochilled

Nizzen said:


> Just change displayport. This bios do not work wit all DP on my palit card. So try another port.
> 
> Max powerdraw is 340w on my palit 3080oc with Master bios. 350w with the new palit oc bios from palit.com


Haaaaaaa! So you also went black?
Do you mean that on the Palit you were able to get this Aorus Master BIOS working after changing the DP port?


----------



## Nizzen

Vapochilled said:


> Haaaaaaa! So you also went black ?
> Do you mean that on Palit you were able to have this BIOS Aorus master working after changing DP port ?


My screen went black too; changing DP fixed it. So yes, the Master BIOS is working on the Palit 3080 OC.


For me the new updated Palit BIOS is better than the Master BIOS for the Palit card. Master is 370W, but it only does 340W max on the Palit; the new Palit BIOS is 350W, and does 350W. At least on my card.


----------



## Reinhardovich773

Vapochilled said:


> Actually, it would be nice if we could build a undervolt table on the 1st page.
> For example, i can do [email protected] or something like, but i guess someone wrote 2000 at 0.96 ?
> The lowest volts we can achieve with stability, the more we can take from TDP.
> When people here wrote they saw drops to 1600 and 1700mhz during Timespy @ 4k, thats because it pulls a lot of TDP under 4k on timespy.
> Unlesss you have a custom curve with this 0.875 at 1880 then 0.9 at 1900 and [email protected], you will see those drops.
> Because under 4k testing on timespy, with just [email protected], i see 360W pulled.
> for example, when i am on COD, at 4k, inside a building (easy rendering), i see [email protected] taking just 240W
> 
> It really depends on the scenario and resolution.
> Could we do this table?


Yeah, that would be great! Though a screenshot of his full table would be greatly appreciated, of course. Out of the many results I've seen with Palit 3080s all over the web, I think @VPII has achieved the best possible Time Spy score. He did have to set the fans to 100% speed to maintain those high boost clocks though...


----------



## Bennimaru

Vapochilled said:


> Actually, it would be nice if we could build a undervolt table on the 1st page.
> For example, i can do [email protected] or something like, but i guess someone wrote 2000 at 0.96 ?
> The lowest volts we can achieve with stability, the more we can take from TDP.
> When people here wrote they saw drops to 1600 and 1700mhz during Timespy @ 4k, thats because it pulls a lot of TDP under 4k on timespy.
> Unlesss you have a custom curve with this 0.875 at 1880 then 0.9 at 1900 and [email protected], you will see those drops.
> Because under 4k testing on timespy, with just [email protected], i see 360W pulled.
> for example, when i am on COD, at 4k, inside a building (easy rendering), i see [email protected] taking just 240W
> 
> It really depends on the scenario and resolution.
> Could we do this table?


This is my curve for now; trying to improve it as much as I can, I don't have much time atm.

Asus 3080 TUF non-OC, this curve, and +1200 on the memory.

This is my latest Time Spy score. Can't wait for my waterblock to arrive late October XD



https://www.3dmark.com/3dm/51460621?


----------



## nievz

Vapochilled said:


> To unbrick you can either use the HDMI port of the motherboard (that used the CPU integrated VGA) or, you can do it like me, and do it all by your own mind under a black screen ! hahahhaa
> I reboot the pc, waited 20secs, inserted my windows password, then windows key + cmd + right arrow, and down arrow to open cmd as admin. Then i did cd c:\3080. nvflash64.exe --protectoff
> I could hear the windows sound. Meaning the cmd worked. Whenever you do the protectoff there is a windows sound and screen flicks.
> After that i did nvflash64.exe -6 gigabyte_gaming_oc.rom
> Waited 10sec. Press Y
> Waited 10sec. Press Y
> 
> Unbrick sucess and all done with black screen hahahahah


Nice! I have to practice this bro hahaha it'll probably come in handy some day.


----------



## Vapochilled

Bennimaru said:


> This is my curve for now, trying to improve it as much as i can, i dont have much time atm.
> 
> Asus 3080 TUF non-OC, this curve and +1200 on memories.
> 
> This is my last score on Time Spy, cant wait for my waterblock to arrive late October XD
> 
> 
> 
> https://www.3dmark.com/3dm/51460621?
> 
> 
> 
> View attachment 2461463



Wow! That's really aggressive. 2000 MHz on only 0.9V? And [email protected]?
Can you play 4K games with that curve? Seems really low... or you have a golden chip.


----------



## zotaclove

I'm using a Zotac RTX 3080. I flashed an MSI BIOS by mistake, then looked for my original BIOS but couldn't find it. I tried the other vendors' BIOSes in a hurry, but none were satisfactory. If anybody has a Zotac RTX 3080 BIOS, please send it to me. Thanks.


----------



## Keninishna

zotaclove said:


> I'm using Jotac rtx3080. I downloaded msi bios by mistake. So I looked for my original bios, but I couldn't find them. I tried using all the other companies' bios in a hurry, but there was nothing satisfactory. If anybody Jotac rtx3080 Bios, please send it to me. thanks.


Here is my stock non-OC Zotac 3080 bin: jotac.zip

If you find a better-than-stock BIOS to use, let me know.


----------



## zotaclove



Keninishna said:


> Here is my stock non oc Jotac 3080 bin jotac.zip
> 
> If you find a better than stock bios to use let me know.


Really, thanks. I got it done because of you. I hope good things come your way.


----------



## Taggen86

Does anyone know the differences between the PCB and VRMs of the Gaming X Trio 3090 compared to the 3080? If they are very similar, then I assume it is relatively safe to flash the 3080 with the EVGA BIOS, raising the power limit to 400W for daily gaming. I really don't care about my card's power consumption if I can raise performance, but if I might break the card I won't flash it for daily gaming.


----------



## spajdr

First test: +40 GPU / +400 VRAM / GB Eagle OC
https://www.3dmark.com/spy/14446374
Did I hit the power limit on the second test? Compared to others I'm getting -20 fps.


----------



## Chrisch

Bennimaru said:


> This is my curve for now, trying to improve it as much as i can, i dont have much time atm.
> 
> Asus 3080 TUF non-OC, this curve and +1200 on memories.
> 
> This is my last score on Time Spy, cant wait for my waterblock to arrive late October XD
> 
> 
> 
> https://www.3dmark.com/3dm/51460621?
> 
> 
> 
> View attachment 2461463


Nice card you've got! 

I need a higher power limit with my TUF; 355W is the maximum 



https://www.3dmark.com/3dm/51464600?



crap backup CPU


----------



## meecoooool

I was only able to go +100 GPU / +1000 VRAM on my FTW3 with the Strix OC BIOS.

It can go above +1000 VRAM, but it only diminishes my Superposition scores.


----------



## Vapochilled

Running the normal Time Spy I'm CPU-bound with my 6700K - 18878 score. You can see the 362W power draw. This is the Gaming OC BIOS on top of the Eagle OC.

*Now if I run the Extreme (I play at 5K), the difference to the new steps is only around 5%.*











I guess most here with a 9900K and EVGA or Asus (better cards than the Gigabyte Eagle OC) are getting a 9700 score. I'm close to 9500 in graphics, so for me there is no reason to upgrade at 5K ultra.


----------



## DStealth

Can you please share this instead of guessing from wrong interpretations?
Go to C:\Program Files\NVIDIA Corporation\NVSMI and run .\nvidia-smi.exe -q -d power
It should look like this:


----------



## martinhal

cstkl1 said:


> View attachment 2461458
> 
> 
> 8 hrs of driving. to god knows what forsaken place
> 
> got it
> View attachment 2461459


That's awesome, a road trip to get a GPU. I would drive 12 hours if that were all I had to do to get a card.


----------



## Alemancio

QQ for you guys who have tested a lot by now.

How come there are cheap cards that reach ~2000 MHz with only ~340W, while cards with a limit of ~420W also only reach ~2025 MHz?

The way I usually understand it is that additional power allows for higher speed (like tires on a car, say), but the GPU also has to be capable of reaching those speeds (the motor, say). What I don't understand is that the two cases (cheap and expensive cards) use different wattage for the same speed; that's where I get confused. (The Strix boosted at 2025 MHz using 400W, but the TUF used much less?)


----------



## Vapochilled

Alemancio said:


> QQ for you guys that have tested a lot now.
> 
> How come there are cheap cards that reach ~2000 MHz with only ~340W and there are cards with a limit of ~420W that also only reach ~2025MHz?
> 
> The way I usually understand it is that additional power allows for higher speed (the same way as say tires on a car) but the GPU would also have to be able to even reach those speeds (say the motor) but what I dont understand is that at both cases (cheap and expensive cards) they even use different Watts for the same speed, thats where I get confused? (Strix boosted at 2025Mhz has used 400W but the TUF much less?)



Several things come into play there.
Is the test the same? TDP and boost clocks all change based on the resolution, e.g. Time Spy vs Time Spy Extreme.

The other thing is that the chip is already pushed to its max efficiency at around 1900 MHz and 0.9V, I'd guess.
Moving from 320W to 420W means ~31% more power for only ~6% more clocks.
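The arithmetic behind that trade-off is a one-liner check (wattages and clocks taken from the exchange above):

```python
# Relative gain from raising the power limit versus the clocks it buys.
power_gain = (420 - 320) / 320      # raising the limit from 320 W to 420 W
clock_gain = (2025 - 1900) / 1900   # roughly 1900 MHz -> 2025 MHz

print(f"+{power_gain:.0%} power for +{clock_gain:.1%} clocks")
# -> +31% power for +6.6% clocks
```

That lopsided ratio is what diminishing returns at the top of the V/F curve look like: the last few percent of clock cost disproportionately more power.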


----------



## Shadowdane

Do we know any info on the Gigabyte 3080 Vision OC card yet? Is it the same as the 3080 Gaming OC card?








GeForce RTX™ 3080 VISION OC 10G (rev. 1.0) Key Features | Graphics Card - GIGABYTE Global


Discover AORUS premium graphics cards, ft. WINDFORCE cooling, RGB lighting, PCB protection, and VR friendly features for the best gaming and VR experience!




www.gigabyte.com


----------



## nycgtr

Made it here before the 3090 club. Oh well. Need blocks asap. I will have BP and EK ones inbound shortly.


----------



## Khello

I'm new here.
I uploaded the Aorus Master BIOS to the website.

And here is a direct link if someone needs it.

Enjoy.

Aorus Master 3080 Bios





Filebin :: bin 2zy7x3dx1tg945ou


Upload files and make them available for your friends. Think of it as Pastebin for files. Registration is not required. Large files are supported.




filebin.net


----------



## equlizer34

vigorito said:


> Any owners of msi ventus model,how are you satisfy overall,2100mhz possible?


I can get 2080 MHz at random times, but not for long; then it goes back down. I have not changed the voltage though.


----------



## Alemancio

Vapochilled said:


> Several times things into play there.
> Is the test the same? Tdp and boost clocks all change based on the resolution. Ex. Timespy vs timespy extreme
> 
> Other thing is, the chip is already pushed to it's max efficiency at 1900mhz and 0.9v I guess.
> Moving from 320w to 420w, means 33% more power, for only 6 % more clocks


What's the additional worth of going from the Asus TUF to the Strix OC if both reach ~2000 MHz?


----------



## Vaesauce

Finally broke into the 12300s on Port Royal on my Aorus 3080 Master.


https://www.3dmark.com/3dm/51476482?



I'm not sure if there is any other BIOS worth trying over the Aorus one. That said, my Board Power Draw in GPU-Z shows a max of only 358.6W.

I think with some more custom curve tuning, I might be able to breach 12400... maybe lol.


----------



## shiokarai

Vaesauce said:


> Finally broke into the 12300s on Port Royal on my Aorus 3080 Master.
> 
> 
> https://www.3dmark.com/3dm/51476482?
> 
> 
> 
> I'm not sure if there is any other BIOS worth trying over the Aorus one. That said, my Board Power Draw on GPU-Z is showing only a max of 358.6W.
> 
> I think with some more custom curve tuning, I might be able to breach 12400... maybe lol.



I've got a few points more with my Zotac Trinity @ 320W + 9900KS at 5.4 GHz (card watercooled).


----------



## shiokarai

Alemancio said:


> QQ for you guys that have tested a lot now.
> 
> How come there are cheap cards that reach ~2000 MHz with only ~340W and there are cards with a limit of ~420W that also only reach ~2025MHz?
> 
> The way I usually understand it is that additional power allows for higher speed (the same way as say tires on a car) but the GPU would also have to be able to even reach those speeds (say the motor) but what I dont understand is that at both cases (cheap and expensive cards) they even use different Watts for the same speed, thats where I get confused? (Strix boosted at 2025Mhz has used 400W but the TUF much less?)


Two words: Silicon Lottery


----------



## spajdr

Ok, fixed my low GPU score in Graphics test 2.
GPU +70 / VRAM +800 (Gaming OC BIOS)
Graphics score: 18 595
Graphics test 1: 122.49 FPS
Graphics test 2: 105.62 FPS


http://www.3dmark.com/spy/14453619



Going to test more later


----------



## freejak13

shiokarai said:


> I've got few points more with my zotac trinity @ 320w + 9900 ks 5.4GHz (card watercooled).


That's one helluva zotac you have. So much for all the crap people were saying about them.


----------



## Vapochilled

Vaesauce said:


> Finally broke into the 12300s on Port Royal on my Aorus 3080 Master.
> 
> 
> https://www.3dmark.com/3dm/51476482?
> 
> 
> 
> I'm not sure if there is any other BIOS worth trying over the Aorus one. That said, my Board Power Draw on GPU-Z is showing only a max of 358.6W.
> 
> I think with some more custom curve tuning, I might be able to breach 12400... maybe lol.



Weird. I can draw 360W with the Gaming OC BIOS... and the Gaming OC should have a 10W lower TDP.


----------



## Bennimaru

Vapochilled said:


> Wow ! Thats relly agressive. 2000Mhz on 0.9v only? abd [email protected]?
> Can you play 4k games with that curve? Seem really low... or you have a golden chip


I've been playing Shadow of the Tomb Raider and RDR2 @4K for a while and the card just runs smooth. I'm really surprised myself; I've never been this lucky with chips, not with the 9xx or 10xx series XD


----------



## Vaesauce

Vapochilled said:


> Weird. I can draw 360W with the Gaming OC BIOS... and Gaming OC should have a 10W lower TDP.


Weird indeeeeed.
I've decided to start Undervolting... and my scores are actually... getting close to my top Overclocking haha.


----------



## acoustic

Secured my FTW3 Ultra today. Running benches now.


----------



## MrBridgeSix

Eagle OC with the Gaming OC BIOS and custom V/F curve:

Time Spy Extreme: https://www.3dmark.com/3dm/51485396?


----------



## acoustic

Seems I have a dud for mem OC. +500mem max .. 550 gives me performance drops.

Max I found on stock cooler @ 100% fan w/ case fans @ 100% and AC on .. +500mem/+100core. Temps don't go above 42c in Port Royal.

Pulled a 12583 in Port Royal. Not too shabby.


----------



## spajdr

MrBridgeSix said:


> Eagle OC with the Gaming OC BIOS and custom V/F curve:
> 
> Time Spy Extreme: https://www.3dmark.com/3dm/51485396?


The result is hidden and will not be shown for example on leaderboards or search.


----------



## freejak13

acoustic said:


> Seems I have a dud for mem OC. +500mem max .. 550 gives me performance drops.
> 
> Max I found on stock cooler @ 100% fan w/ case fans @ 100% and AC on .. +500mem/+100core. Temps don't go above 42c in Port Royal.
> 
> Pulled a 12583 in Port Royal. Not too shabby.


The max I was able to hit was 12300 in Port Royal, but my room is pretty warm and I don't have case fans pointed at the GPU. I think with water cooling this thing will scream.


----------



## MrBridgeSix

spajdr said:


> The result is hidden and will not be shown for example on leaderboards or search.


I think it is because I use an ES Processor.


----------



## NeeDforKill

Hello, today I got my Gigabyte Gaming OC. Any idea why it won't go above 350W, and why in tools like MSI Afterburner/EVGA X1 the power target only goes to 100%?


----------



## MrBridgeSix

NeeDforKill said:


> Hello, today I got my Gigabyte Gaming OC. Any idea why it won't go above 350W, and why in tools like MSI Afterburner/EVGA X1 the power target only goes to 100%?


100% is 370W, typical power draw under load is 350~360W, but when using RT + DLSS you can see it peaking at 370W.


----------



## Celeras

My board draw tops out at ~330W with the eVGA XC3 Ultra. Power limit is all the way up and GPUZ says the maximum should be 366W. 

Is this the problem people were referring to earlier in the thread?


----------



## dr.Rafi

Alemancio said:


> QQ for you guys that have tested a lot now.
> 
> How come there are cheap cards that reach ~2000MHz with only ~340W, while there are cards with a limit of ~420W that also only reach ~2025MHz?
> 
> The way I usually understand it is that additional power allows for higher speed (the same way as, say, tires on a car), but the GPU would also have to be able to reach those speeds (say, the motor). What I don't understand is that in both cases (cheap and expensive cards) they use different wattages for the same speed; that's where I get confused. (The Strix boosted at 2025MHz has used 400W, but the TUF much less?)


It's all marketing; expensive or cheap doesn't change anything. I have a 3080 Ventus flashed with the Asus BIOS, which pulls 400W but never clocks as well as the Gigabyte Gaming OC BIOS: that one uses only 320W, clocks better, and scored 400 points more than the Asus BIOS. In both cases I overclocked the card to its maximum stable clock, which was 1920MHz in MSI Afterburner and GPU-Z, giving 1950 to 2000MHz on the on-screen display during the benchmark.


----------



## dr.Rafi

cstkl1 said:


> View attachment 2461458
> 
> 
> 8 hrs of driving. to god knows what forsaken place
> 
> got it
> View attachment 2461459


Can you please share the BIOS for this card, dumped with NVFlash? Thanks


----------



## dr.Rafi

nycgtr said:


> Made it here before the 3090 club. Oh well. Need blocks asap. I will have BP and EK ones inbound shortly.
> View attachment 2461497


Can you share the BIOS for this card please? Thanks


----------



## Talon2016

Celeras said:


> My board draw tops out at ~330W with the eVGA XC3 Ultra. Power limit is all the way up and GPUZ says the maximum should be 366W.
> 
> Is this the problem people were referring to earlier in the thread?


Yes this is exactly the problem we've been talking about. EVGA won't acknowledge the issue. The card is garbage with that low power limit, especially for 4K gaming which causes lower than stock boost clocks under certain gaming conditions.


----------



## Celeras

Talon2016 said:


> Yes this is exactly the problem we've been talking about. EVGA won't acknowledge the issue. The card is garbage with that low power limit, especially for 4K gaming which causes lower than stock boost clocks under certain gaming conditions.


Bummer. Though I'm sure someone smart will come out with a BIOS that does for the two-pin cards what the shared version does for the three-pin FTWs.


----------



## cstkl1

dr.Rafi said:


> Can you please share the bios for this card using NVflash. thanks


btw another friend has both the TUF and Strix OC also.

He is doing a review, and will be doing one with WC too, plus pics of the size difference etc. on a mobo.






(embedded links: "NVIDIA GeForce Community V19" and "NVIDIA Graphics Card Overclocking V1" threads on forum.lowyat.net)





Will be leaving this place on Sunday, so will upload the BIOSes in the evening unless @owikh84 does it.

The Bitspower WB for me should be here by the end of next week.


----------



## KenjiS

So uh, Question

How big of a difference does it make if ive been ocing using the Normal bios and not the OC one? Because apparently ive been doing that....

I feel very stupid for not checking the switch until now...


----------



## VPII

I'll be honest, I have tried several BIOSes on my Palit GamingPro OC, and they were as follows:
Asus TUF - worked, but performance not that great
Asus TUF OC - worked, but performance not that great
Asus Strix OC - worked, but performance not that great and power draw through the roof
EVGA XC3 - worked, but performance not that great
EVGA FTW3 Ultra - worked, but performance not that great and power draw through the roof
Gigabyte OC - worked, but performance not that great
Gigabyte Aorus Master - performance at stock was actually pretty great, but the moment I OC'd it came crashing down

So I have not tried a single MSI BIOS yet, but will do so when I get a decent one to try.

The problem is, no matter which BIOS I try, my clocks drop at 320W and above, even on the normal Palit BIOS with the 9% increase for 350W. I've been in contact with Palit about this and they told me the limit is 2x 8-pin at 150W each plus 66W from the PCIe slot... he he he. But I'll await their response on why, even when I increase the power limit to 350W, I still drop clocks at 320W.


----------



## dr.Rafi

VPII said:


> I'll be honest, I have tried several BIOSes on my Palit GamingPro OC, and they were as follows:
> Asus TUF - worked, but performance not that great
> Asus TUF OC - worked, but performance not that great
> Asus Strix OC - worked, but performance not that great and power draw through the roof
> EVGA XC3 - worked, but performance not that great
> EVGA FTW3 Ultra - worked, but performance not that great and power draw through the roof
> Gigabyte OC - worked, but performance not that great
> Gigabyte Aorus Master - performance at stock was actually pretty great, but the moment I OC'd it came crashing down
> 
> So I have not tried a single MSI BIOS yet, but will do so when I get a decent one to try.
> 
> The problem is, no matter which BIOS I try, my clocks drop at 320W and above, even on the normal Palit BIOS with the 9% increase for 350W. I've been in contact with Palit about this and they told me the limit is 2x 8-pin at 150W each plus 66W from the PCIe slot... he he he. But I'll await their response on why, even when I increase the power limit to 350W, I still drop clocks at 320W.


Same even iam using Ax1600i corsair power supply.


----------



## hemon

Alemancio said:


> What's the additional worth from going Asus TUF to Strix OC if both reach ~2000Mhz?


I really would like to know this too; can someone answer the question? The main difference seems to be the power limit: 375W vs 450W. What does such a difference practically mean?


----------



## VPII

dr.Rafi said:


> Same even iam using Ax1600i corsair power supply.


Yes, I have the Corsair HX1200 PSU, so with 40 amps on each 12V rail going to the GPU you're effectively able to push 480W per 12V rail, so in effect 960W of full power to the GPU. But that is theoretical, so maybe drop 15 to 25% to play it safe.
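As a rough sanity check on that rail math (a sketch only; 40A per rail is the spec figure quoted above, and the derate is the suggested 15-25%):

```python
# Back-of-the-envelope PSU headroom math for a GPU fed by two 12V rails.
# 40A per rail is the figure quoted for the HX1200 above; the 20% derate
# is just the middle of the 15-25% safety margin suggested in the post.

def rail_watts(amps: float, volts: float = 12.0) -> float:
    """Theoretical maximum wattage of one rail."""
    return amps * volts

def usable_watts(theoretical: float, derate: float = 0.20) -> float:
    """Apply a safety derate to the theoretical figure."""
    return theoretical * (1.0 - derate)

per_rail = rail_watts(40)        # 480 W per 12V rail
two_rails = 2 * per_rail         # 960 W theoretical to the GPU
print(usable_watts(two_rails))   # 768.0 W with a 20% derate
```

Either way, far more than any 3080 BIOS will let the card pull.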

I am actually so happy; look, I only managed 12k in Port Royal once, with this run.


https://www.3dmark.com/pr/325939



But look at the stated clock speed vs the average clock speed. I am not able to do it again at present; it might have been temps.

But this morning I fooled around with my V/F curve and wanted to see what I could manage with +135MHz core (effectively 2190MHz max) and +750MHz memory. Now I am happy, as this was the closest I've got to a 2000MHz average clock speed during a run. You need to understand I am stuck with a 320W power limit, even when the limit is raised to 350W. I tested with and without the increase: with no increase the max power consumption would be around 325W, and with the 9% increase it would draw around 341W, so still below the 350W mark. But here is the result from this morning.



https://www.3dmark.com/pr/385431


----------



## Alemancio

dr.Rafi said:


> It's all marketing; expensive or cheap doesn't change anything. I have a 3080 Ventus flashed with the Asus BIOS, which pulls 400W but never clocks as well as the Gigabyte Gaming OC BIOS: that one uses only 320W, clocks better, and scored 400 points more than the Asus BIOS. In both cases I overclocked the card to its maximum stable clock, which was 1920MHz in MSI Afterburner and GPU-Z, giving 1950 to 2000MHz on the on-screen display during the benchmark.


Thanks, that's what I suspected... that almost all 3080s are basically equal.


----------



## Blotto80

Here's my best Timespy run on my FE so far that's just with straight offset of +180core and +450mem. Playing with some lower voltage steps in the curve editor, I can hold 1920mhz at .850v and I'm working on getting 2ghz stable at .900v or .925v. I'm really looking forward to getting a block on this guy. I think there's some headroom in the mem being left on the table due to the high temps of the vRAM on the FE.


----------



## Chrisch

hemon said:


> I really would like to know this too; can someone answer the question? The main difference seems to be the power limit: 375W vs 450W. What does such a difference practically mean?


In daily use, nothing, because Ampere runs better undervolted (1.8-1.9GHz @ 0.85v). But if you want high scores in benchmarks, a higher power limit gives you more headroom for higher stable clocks.
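The trade-off above tracks the usual dynamic-power approximation P ∝ f·V². This is a sketch only: it ignores leakage, and the two operating points are just round numbers in the range quoted in this thread, not measurements:

```python
# Approximate relative GPU power using P ~ f * V^2 (dynamic power only;
# real cards also draw static/leakage power, so treat this as a sketch).

def relative_power(freq_mhz: float, volts: float) -> float:
    """Relative (unitless) dynamic power at a given clock and voltage."""
    return freq_mhz * volts ** 2

stock = relative_power(1900, 1.05)       # rough stock-ish operating point
undervolt = relative_power(1850, 0.85)   # a typical 0.85v undervolt point

print(f"undervolted draw ~ {undervolt / stock:.0%} of stock")  # ~64%
```

Losing ~3% of clock for roughly a third of the power is why undervolting pays off so well on these cards.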


----------



## djriful

https://www.3dmark.com/pr/370799 stock volt


----------



## ssgwright

If anyone is interested in either a 3080 Alphacool waterblock or an EK waterblock, I have both. I'll sell either for $100 shipped. The EK got delayed so I ordered the Alphacool; come to find out I ended up with an ASUS TUF, which these blocks don't fit. The Alphacool is opened as I tried to fit it; the EK block will be brand new in box.


----------



## shiokarai

djriful said:


> https://www.3dmark.com/pr/370799 stock volt


Nice chip you have... it seems FE has the best bins!


----------



## Chrisch

I doubt that they actually bin GPUs



https://www.3dmark.com/pr/383557



ASUS TUF


----------



## Orlovki

Anyone got the Zotac Trinity OC and could dump the ROM with nvflash?


----------



## spajdr

NeeDforKill said:


> Hello, today I got my Gigabyte Gaming OC. Any idea why it won't go above 350W, and why in tools like MSI Afterburner/EVGA X1 the power target only goes to 100%?


Download the Boundary ray tracing benchmark and disable DLSS; it should load your GPU to 100%.


----------



## Vapochilled

I have the same BIOS on my Eagle OC and I don't see anything above 357W during Time Spy Extreme.
During gaming, it's around 335~343W.
Most people here don't see any 3x 8-pin BIOS hitting the values they claim.
Even the TUF, rated for 370W, I see people saying maxes out around 35xW.


----------



## freejak13

shiokarai said:


> Nice chip you have... it seems FE has the best bins!


Yeah I'm starting to notice the same pattern. My FE is coming next week so I'll be able to compare with the ftw3 ultra I'm running now.


----------



## locc

Vapochilled said:


> I have the same BIOS on my Eagle OC and I don't see anything above 357W during Time Spy Extreme.
> During gaming, it's around 335~343W.
> Most people here don't see any 3x 8-pin BIOS hitting the values they claim.
> Even the TUF, rated for 370W, I see people saying maxes out around 35xW.


I've got the TUF OC and indeed it can only pull 355W. Basically, setting the power limit to 105% gives that 355W, and going all the way to 110% gives nothing more. I've tried several programs and the result is always the same. Below is a shot from NVIDIA's own tool; it shows 375W max, but something is holding it back in reality.


----------



## VPII

locc said:


> I've got the TUF OC and indeed it can only pull 355W. Basically, setting the power limit to 105% gives that 355W, and going all the way to 110% gives nothing more. I've tried several programs and the result is always the same. Below is a shot from NVIDIA's own tool; it shows 375W max, but something is holding it back in reality.
> View attachment 2461540


I've said it once, and I'll say it again: I think these cards are hardware locked to 320W, and reaching 320W or above drops the clocks. Not sure if that's so for your TUF OC, as it has a 340W base power limit, but it is for my Palit GamingPro OC.


----------



## dev1ance

Can anyone with a Strix or Aorus card (even Master) measure clock frequency fluctuations over time?

I'm curious on how things are stacking up with power phase designs and how the frequency is fluctuating.

This is a 22+4 power phase card and the clocks are insanely stable (2085 OC, 1955 stock but it simply steps down without constant jumps), Colorful Vulcan: (screenshot)

TUF OC: (screenshot)

Gigabyte Gaming OC: (screenshot)


----------



## martinhal

VPII said:


> I've said it once, and I'll say it again: I think these cards are hardware locked to 320W, and reaching 320W or above drops the clocks. Not sure if that's so for your TUF OC, as it has a 340W base power limit, but it is for my Palit GamingPro OC.


How does the power limit affect gaming? I have a Palit coming in and an option on a more expensive card. Would the extra 5K be worth it outside of e-peen?


----------



## VPII

martinhal said:


> How does the power limit affect gaming ? I have a Palit coming in and an option on a more expensive card. Would the extra 5K be worth it outside epeen ?


Nope, not really, in all honesty. Gaming is perfect and fast.


----------



## martinhal

That's what I thought, but once one sees three power connectors and flashing RGB one tends to become stupid. But that's the hobby. I was looking here for gaming benchmarks, but all I see is BIOS flashing, shunt mods, and people breaking power connectors.


----------



## XxXSpitfireXxX

I was able to get a Gigabyte 3080 Gaming OC, a miracle in Canada at this time to get any 3080 at non-scalper bot prices.

Tested the card with the stock bios, +95 Core +700 mem is the max before seeing score degradation. Here’s the score:



https://www.3dmark.com/spy/14461044



I’m looking to try a new bios but going through the discussion I see that there are few upgrades for the Gaming OC on a 2x8 pin except the Aorus Master which caused a black screen for another user. Anyone else try the Aorus Master bios on the Gigabyte Gaming OC?


----------



## spajdr

@XxXSpitfireXxX seems he has a similar score


https://www.3dmark.com/compare/spy/14461044/spy/14466541


I didn't push Core higher than +95 for now, gonna try it again. My VRAM also ends at 700, anything more and fps degrades.


----------



## zhrooms

martinhal said:


> That's what I thought, but once one sees three power connectors and flashing RGB one tends to become stupid. But that's the hobby. I was looking here for gaming benchmarks, but all I see is BIOS flashing, shunt mods, and people breaking power connectors.


Power Limit is the *ONLY* thing that matters; a 450W Strix on air with the lowest fan speed at 85°C will still run a higher clock than a $1000 water cooled 320W Ventus at 35°C.

It works the same as on Turing, nothing has changed. The VRM means absolutely nothing for the average consumer; more power stages are for *LN2 overclocking exclusively*. It's just marketing: each additional power stage costs next to nothing, especially this time when all cards use double PWM controllers regardless.

16 vs 20 power stages = slightly lower VRM temps; it doesn't affect overclocking or stability. Most people still don't know that cards like the FTW3 and Strix are LN2 cards, with a VRM built to handle 1000W. The stock 16-stage VRM can do 600W safely, and most cards are limited to under 400W; it's all a big marketing farce, basically.









How NVIDIA neutered the RTX 2080 Ti with intrusive power... - by zhrooms, last updated August 30, 2020 - www.overclock.net





The 2080 Ti was severely power limited already; at just 1.025V (40°C) I'm reaching 385W in some more intensive games, as demonstrated in the link above. Utilizing the full 1.093V that NVIDIA allows, power consumption reaches close to 500W in Metro Exodus. Power consumption is insanely high on the 3090 because of the 24 memory modules and 20% CUDA core increase; the Strix's 480W power limit is not nearly enough. As an example, shunt modding it (raising the power limit) should make Time Spy consume up to 700W at full voltage.

Sure, efficiency goes out the window, but if you care about efficiency you should just buy the cheapest card and undervolt it so it never consumes more than 300W.

Check the original post at the bottom for a table of price per watt, and you'll see which cards have great or horrible value. The very expensive AORUS Master, for example, costs $60 more than the FTW3 but has a 20W lower power limit, and if you flash the FTW3 with the Strix BIOS you get 450W: a 70W higher power limit at $60 less. (3x 8-pin BIOSes won't work on 2x8-pin cards, so 380W is the most you're going to get on 2x8-pin. Maybe if we get an XOC BIOS in the future, but doubtful, because these are 3080 cards; every overclocker will use 3090s, and that's where all the custom BIOSes will appear.)
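The price-per-watt idea boils down to one line of arithmetic. Sketch only: the watt limits are the 450W (FTW3 flashed with the Strix BIOS) and 380W (AORUS Master) figures from this thread, but the dollar figures are placeholder prices consistent with the $60 gap named above, not actual MSRPs:

```python
# Dollars per watt of power limit for the two cards compared above.
# Prices are ASSUMED placeholders (only the $60 gap is from the post);
# the power limits are the figures discussed in this thread.
cards = {
    "FTW3 Ultra (Strix BIOS)": (810, 450),  # (assumed price $, W limit)
    "AORUS Master":            (870, 380),
}
for name, (price, watts) in cards.items():
    print(f"{name}: ${price / watts:.2f} per W")  # lower is better value
```

Whatever the exact prices, the ranking is driven almost entirely by the power limit, which is the point of the post.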


----------



## XxXSpitfireXxX

spajdr said:


> @XxXSpitfireXxX seems he has a similar score
> 
> 
> https://www.3dmark.com/compare/spy/14461044/spy/14466541
> 
> 
> I didn't push Core higher than +95 for now, gonna try it again. My VRAM also ends at 700, anything more and fps degrades.


Your temps and avg core clock are better; are you running 100% fan speed? If not, how are your temps so low?


----------



## spajdr

I have it set to 80% fan speed for benchmarks, otherwise I leave it at AUTO


----------



## Nizzen

dev1ance said:


> Can anyone with a Strix or Aorus card (even Master) measure clock frequency fluctuations over time?
> 
> I'm curious on how things are stacking up with power phase designs and how the frequency is fluctuating.
> 
> This is a 22+4 power phase card and the clocks are insanely stable (2085 OC, 1955 stock but it simply steps down without constant jumps), Colorful Vulcan:
> 
> TUF OC:
> 
> Gigabyte Gaming OC:


Looks like a Colorful sponsored test.

Got a link to the "review"?


----------



## mouacyk

dev1ance said:


> Can anyone with a Strix or Aorus card (even Master) measure clock frequency fluctuations over time?
> 
> I'm curious on how things are stacking up with power phase designs and how the frequency is fluctuating.
> 
> This is a 22+4 power phase card and the clocks are insanely stable (2085 OC, 1955 stock but it simply steps down without constant jumps), Colorful Vulcan:
> 
> TUF OC:
> Gigabyte Gaming OC:


Is it possible the reviewer left vsync on and the test just never happened to power throttle?


----------



## Zeakie

My Zotac OC is getting 11900 in Port Royal; 100% fan speed, +190 core, +550 mem, nothing else touched...


----------



## spajdr

Zeakie said:


> My Zotac OC is getting 11900 in Port Royal; 100% fan speed, +190 core, +550 mem, nothing else touched...


More or less we all have similar results 


https://www.3dmark.com/pr/387095


----------



## Mad Pistol

Got a 3080 FE on the way. Can't wait!


----------



## Zeakie

spajdr said:


> More or less we all have similar results
> 
> 
> https://www.3dmark.com/pr/387095


Good to see the Zotac in and around the same ballpark as the rest of them. Haven't tried flashing anything yet, as someone on here said it's not really worth it on Zotac yet? Hopefully the AMP BIOS fixes that.


----------



## spajdr

Zeakie said:


> Good to see the Zotac in and around the same ballpark as the rest of them. Haven't tried flashing anything yet, as someone on here said it's not really worth it on Zotac yet? Hopefully the AMP BIOS fixes that.


I don't think it's worth it, you have some decent OC already.


----------



## pewpewlazer

Nizzen said:


> Looks like a Colorful sponsored test.
> 
> Got a link to the "review"?


Colorful card: tested on 456.55 drivers, aka the "crashing fix/boost stability improvement" driver.
ASUS and Gigabyte cards: tested on 456.16 drivers, aka the ones that began the SP-cap/MLCC cap witch hunt due to highly erratic boost clocks.

Real fair comparison... definitely not biased at all...

Not to mention, letting any card bounce off the power limit constantly is just lazy overclocking. Even on the old drivers with a low power limit, any card tested should produce similarly stable clock speeds if properly overclocked (read: find the highest voltage it can run without smashing the power limit every 2 seconds, not just applying an offset).
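That "find the highest voltage that doesn't smash the power limit" approach is essentially a downward search over the V/F curve. A rough sketch, where `measure_power` is a hypothetical stand-in for logging sustained board power while a benchmark loops (not a real API):

```python
# Sketch of the curve-tuning loop described above: walk down the voltage
# steps until sustained board power stays under the limit.
# measure_power(v) is a HYPOTHETICAL callback standing in for reading
# board power (e.g. from HWiNFO logs) while a benchmark runs at voltage v.

def find_voltage(limit_w, measure_power, v_start=1.05, v_min=0.80, step=0.025):
    v = v_start
    while v >= v_min:
        if measure_power(v) < limit_w:
            return v      # highest voltage that holds under the limit
        v = round(v - step, 3)
    return v_min

# Toy quadratic model standing in for real measurements:
fake_draw = lambda v: 360 * (v / 1.05) ** 2
print(find_voltage(340, fake_draw))   # 1.0 with this toy model
```

In practice you would then lock the clock at that voltage point in the curve editor rather than applying a flat offset.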


----------



## spajdr

Vapochilled said:


> I have the same BIOS on my Eagle OC and I don't see anything above 357W during Time Spy Extreme.
> During gaming, it's around 335~343W.
> Most people here don't see any 3x 8-pin BIOS hitting the values they claim.
> Even the TUF, rated for 370W, I see people saying maxes out around 35xW.


I ran 3DMark Time Spy just now with HWiNFO64 in the background, and the highest I can see is 360.5W.
+120 GPU / 700 VRAM


----------



## Daepilin

Strix OC: clocks are not quite as stable under full OC (+140/1250) as with that Colorful; in Time Spy they move a little, but not nearly as much as the TUF OC in your example, and I'd say 85-90% of the time my card can keep 2115-2130 in normal Time Spy.

Haven't done many gaming benchmarks, but in Borderlands 3:

The OC run (450W, +140/1250): started at 2115, quickly down to 2100, before stepping down one more notch to 2085, I think because of heat (70+°C): 115 FPS
UV run (~290W, [email protected], memory +500): of course very stable 1935MHz: 108 FPS

Crazy how badly these things scale, and mine isn't even a great chip (decent or okay, but I've seen TUF OCs go higher with less power, or some cards hitting 1905MHz on 0.8v)...


----------



## MrBridgeSix

The RTX 3080 has been the most frustrating piece of hardware to overclock that I've ever had. My advice for 24/7 use is to just find your stable frequency at 0.9V, which will reduce power draw, temps and noise while keeping the same perf as stock. Perhaps don't even touch VRAM clocks, as they don't seem to return much and are hard to test...

After 8 hours of tweaking my best STABLE (tested in other games) results are:
Average core clock: 1980mhz
Mem: +700
Time Spy Extreme: 9390
Port Royal: 12000

I can get some extra 100 points in each but it crashes in SOTR.

Stock Eagle OC with the Gaming OC BIOS was 9000 and 11700...


----------



## spajdr

My first undervolt result:

0.800V, 1800MHz (held the whole time), 3DMark Time Spy (max is 256W in HWiNFO64)

Graphics score *17 293*
Graphics test 1 *113.29 FPS*
Graphics test 2 *98.70 FPS*



http://www.3dmark.com/spy/14478747



vs my maximum

Graphics Score *18955* (max 360W)
Graphics Test 1 *124.71 FPS*
Graphics Test 2 *107.79 FPS*


https://www.3dmark.com/spy/14477935
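Points per watt from the two runs above (just the scores and max draws as posted):

```python
# Efficiency comparison using the two Time Spy runs posted above.
runs = {
    "undervolt 0.800V": (17293, 256),  # (graphics score, max W observed)
    "max OC":           (18955, 360),
}
for name, (score, watts) in runs.items():
    print(f"{name}: {score / watts:.1f} points/W")
# undervolt 0.800V: 67.6 points/W
# max OC:           52.7 points/W
```

About 9% of the score costs roughly 100W at the top of the curve.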


----------



## Nizzen

NVIDIA became like AMD Ryzen CPUs. Undervolt that sucker


----------



## Vapochilled

People with the Gigabyte Aorus Master, could you check your max TDP in GPU-Z or MSI AB? Of those 380W, how much can you actually reach?


----------



## dev1ance

Nizzen said:


> Looks like a Colorful sponsored test.
> 
> Got a link to the "review"?





mouacyk said:


> Is it possible the reviewer left vsync on and the test just never happened to power throttle?


I wonder if it's because of driver differences; the Colorful was only tested recently, on the latest 'fixed' drivers.
They tested the cards separately (the other cards were tested at launch), and every other card seems to have been tested with older drivers, which may be the factor, I think.








COLORFUL iGAME GeForce RTX 3080 Vulcan OC 10GB - quasarzone.com

Other reviews:

ASUS TUF Gaming GeForce RTX 3080 O10G OC 10GB - quasarzone.com

GIGABYTE GeForce RTX 3080 Gaming OC 10GB - quasarzone.com








Daepilin said:


> Strix OC: clocks are not quite as stable under full OC (+140/1250) as with that Colorful; in Time Spy they move a little, but not nearly as much as the TUF OC in your example, and I'd say 85-90% of the time my card can keep 2115-2130 in normal Time Spy.


Thanks! Good to see that it's likely drivers then. Got an Aorus Xtreme on the way and was curious.


----------



## cstkl1

shiokarai said:


> Nice chip you have... it seems FE has the best bins!


what u smoking dude.

you do know tuf all has clock spikes 2175-2220 right
dual bios
extra hdmi
full mlcc caps
silent and low temps
a small rgb line like knight rider 

FE is garbage vs that.


----------



## markuaw1

asdkj1740 said:


> how about those icx temp sensors on pcb? are they still working?


Just flashed it tonight; so far so good. The more stable clocks are nice. Sucks that GPU-Z doesn't report the power right.


----------



## munternet

cstkl1 said:


> what u smoking dude.
> 
> you do know tuf all has clock spikes 2175-2220 right
> dual bios
> extra hdmi
> full mlcc caps
> silent and low temps
> a small rgb line like knight rider
> 
> FE is garbage vs that.


I have the 3080 Tuf OC coming (if it ever arrives)
Is this the model you are saying is a decent GPU? 
Also wondering which EK block will fit this? or is there a better option available from New Zealand?
Cheers


----------



## ssgwright

Both Alphacool and EK have TUF blocks, but neither is available until the end of October.


----------



## cstkl1

munternet said:


> I have the 3080 Tuf OC coming (if it ever arrives)
> Is this the model you are saying is a decent GPU?
> Also wondering which EK block will fit this? or is there a better option available from New Zealand?
> Cheers


Decent? It's the best 3080 and the cheapest; it's only been nerfed by power.

Bro, the card temps are so low that WC may not even be worth it.

If we can get a custom BIOS for this card that overrides the power limit in the InfoROM, nobody will buy the Strix. That's how good it is.

See @owikh84's review on Lowyat. He's doing a proper comparison on a properly tuned 10900K, RAM, etc., not a dumbass reviewer.

The Strix's only real advantage is that 3rd 8-pin for WC; other than that, the TUF is actually perfect.

The current nvflash I think is incomplete; it's not flashing the InfoROM. I suspect it's protected.

Just like the 2080 Ti Strix, which later on got an unlimited power BIOS, the TUF will have one soon.

Bitspower has blocks for the TUF and Strix; EK is supposed to do a Strix.

@owikh84's first posting already showed the OC BIOS/pre-binned OC is BS; it's ASIC boosting. The TUF and Strix OC both maxed at 1980 boost at stock, but the Strix OC had a higher sustained clock due to power.

So DO NOT pay more for the TUF OC if you have a choice between it and the TUF.

----------



## VPII

I finally managed to do a Port Royal run where my average clock speed is above 2000MHz... only just, but I feel that's pretty great for a card with an effective 320W power limit. I have already broken the 12K mark in Port Royal, but only just. Just thought I'd share this.



https://www.3dmark.com/pr/389233


----------



## Mucho

VPII said:


> I finally managed to do a Port Royal run where my average clock speed is above 2000MHz... only just, but I feel that's pretty great for a card with an effective 320W power limit. I have already broken the 12K mark in Port Royal, but only just. Just thought I'd share this.
> 
> 
> 
> https://www.3dmark.com/pr/389233


Please try the Inno3D X3 BIOS with 340W PL. It's also a reference PCB, so maybe it changes your PL to 340W. On my Palit I get a 350W PL with the original Palit BIOS; with the Inno3D BIOS my PL goes down to 340W.






(File-Upload.net link - the file could no longer be found)


----------



## ssgwright

Love my ASUS TUF. Here are my benches (non-OC version):

Timespy: http://www.3dmark.com/spy/14487383











Port: http://www.3dmark.com/pr/389591


----------



## doubledoubt

Vapochilled said:


> People with the Gigabyte Aorus Master, could you check your max TDP in GPU-Z or MSI AB? Of those 380W, how much can you actually reach?


I can't go past 350W (according to both GPU-Z and MSI AB), unless I'm doing something wrong... I made the mistake of installing RGB Fusion, which I suspect may have broken the firmware, since the LCD doesn't work and no matter what changes I make in the Engine, nothing happens on the GPU. Everything works fine using MSI AB: I can OC, make changes, etc. I have never messed with GPU firmware, so I'm a bit hesitant to go there for now.


----------



## shiokarai

cstkl1 said:


> what u smoking dude.
> 
> you do know tuf all has clock spikes 2175-2220 right
> dual bios
> extra hdmi
> full mlcc caps
> silent and low temps
> a small rgb line like knight rider
> 
> FE is garbage vs that.


So? Chip quality still decides, despite all the extra "goodies". A bad chip with full MLCC caps is still a bad chip. It's the lottery and you know it. No, the FE isn't garbage lol.


----------



## VPII

Mucho said:


> Please try the Inno3D X3 BIOS with the 340 W power limit. It's also a reference PCB, so maybe it changes your PL to 340 W. On my Palit I get a 350 W PL with the original Palit BIOS; with the Inno3D BIOS my PL drops to 340 W.


Can you do me a favour and give me the link to the Inno3D BIOS again? It did not want to download when I tried yesterday.


----------



## asdkj1740

markuaw1 said:


> just flashed it tonight, so far so good. more stable clocks are nice; sucks that GPU-Z doesn't report the power right.
> 
> 
> 
> 
> View attachment 2461600


very nice, thanks.


----------



## DStealth

VPII said:


> I finally managed to do a Port Royal run where my average clock speed is above 2000 MHz... only just, but I feel that is pretty great for a card with an effective 320 W power limit. I have already broken the 12K mark in Port Royal, but only just. Just thought I'd share this.
> 
> 
> 
> https://www.3dmark.com/pr/389233


Looks like the 320 W limit with the stock Palit OC updated BIOS and [email protected] gives the best score on these cards. Still awaiting the WB, and will do a shunt mod....


https://www.3dmark.com/compare/pr/389233/pr/375733#


----------



## GTANY

EVGA RTX 3080 FTW3 teardown and disassembly :


----------



## asdkj1740

GTANY said:


> EVGA RTX 3080 FTW3 teardown and disassembly :


A 900 W BIOS for the 3080 with 3×8-pin.

Let's make a wish that these techtubers would kindly share it, or accidentally leak it.


----------



## VPII

DStealth said:


> Looks like the 320 W limit with the stock Palit OC updated BIOS and [email protected] gives the best score on these cards. Still awaiting the WB, and will do a shunt mod....
> 
> 
> https://www.3dmark.com/compare/pr/389233/pr/375733#


I've played around with the V-curve, but it is somewhat frustrating at times. It's as if this card is hardware-limited to 320 W: even increasing the power limit by 9% to 350 W doesn't change the fact that you lose clocks from 320 W onwards. However, your result without the power limit increase will be lower than with it raised, or at least that is what I found when I tested it.
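The slider math here is simple enough to sanity-check. A minimal sketch (the 320 W base and 109% cap are from the posts in this thread; the function name is mine):

```python
def power_limit_watts(base_w: float, slider_percent: float) -> float:
    """Convert a power-limit slider setting (e.g. 109%) to a wattage cap."""
    return base_w * slider_percent / 100.0

# Reference-PCB 3080 BIOS: 320 W base, slider capped at 109%
assert power_limit_watts(320, 100) == 320.0
assert round(power_limit_watts(320, 109), 1) == 348.8  # the "350 W" ceiling, roughly
```

Which is why the +9% slider only buys about 29 W of headroom before the card throttles again.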


----------



## pewpewlazer

asdkj1740 said:


> A 900 W BIOS for the 3080 with 3×8-pin.
> 
> Let's make a wish that these techtubers would kindly share it, or accidentally leak it.


The "semi-disclosure agreement" (where you're allowed to brag about having it, but aren't allowed to give it out) BIOSes are nothing new. YouTube entertainers get a super special BIOS flash, while the rest of us normal folks get to grab a soldering iron.


----------



## Mucho

asdkj1740 said:


> A 900 W BIOS for the 3080 with 3×8-pin.
> 
> Let's make a wish that these techtubers would kindly share it, or accidentally leak it.


I don't think they will share the BIOS. GN and all the other OCers got it from EVGA/Kingpin, I assume. I hope there will be a tool for changing the PL on every 3080.


----------



## shiokarai

pewpewlazer said:


> The "semi-disclosure agreement" (where you're allowed to brag about having it, but aren't allowed to give it out) BIOSes are nothing new. YouTube entertainers get a super special BIOS flash, while the rest of us normal folks get to grab a soldering iron.


So true. It's funny, because it's basically an ad for the product, but you can't get the product to perform as in the ad, because reasons... Also, GamersNexus are the first to shrug at and condemn every attempt to get cozy with manufacturers, special treatment, etc., yet they go and get a special-recipe BIOS just for them to generate views/clicks, and they say nothing about it. Not even a word like "users should have access to this BIOS". Disappointing.


----------



## Nizzen

pewpewlazer said:


> The "semi-disclosure agreement" (where you're allowed to brag about having it, but aren't allowed to give it out) BIOSes are nothing new. YouTube entertainers get a super special BIOS flash, while the rest of us normal folks get to grab a soldering iron.


The Hall of Fame should be locked for "special" BIOSes and "special" hardware that is not publicly available. My opinion. 
Soldering and LN2 are OK, because they're public.


----------



## asdkj1740

pewpewlazer said:


> The "semi-disclosure agreement" (where you're allowed to brag about having it, but aren't allowed to give it out) BIOSes are nothing new. YouTube entertainers get a super special BIOS flash, while the rest of us normal folks get to grab a soldering iron.


i believe everyone here is trying hard to test their rtx 3000 cards (bought by themselves, with their own money), taking the risk of bricking cards by cross-flashing and hard-modding to get more power limit. the dudes here are doing great work to help the whole community, and having so much fun that every newcomer can join in.


that's why i don't watch any of those videos, starting from the rtx 2000 era.




shiokarai said:


> So true. It's funny, because it's basically an ad for the product, but you can't get the product to perform as in this ad, because reasons... Also, GamersNexus are the first to shrug at and condemn every attempt to get cozy with manufacturers, special treatment etc. yet they go, get special recipe BIOS just for them to generate views/clicks and they say nothing about it, not even a word ie. "users should have the access to this BIOS" or something. Disappointing.


ever since nvidia banned flashing modded bioses on pascal cards, none of these big guys has spoken out for us. all the risk of trying different flashed bioses is taken by end users, and now we are facing the most power-throttled cards ever.


----------



## Mucho

All those YT heroes like GN, J2C, Bitwit, Paul's Hardware don't say a word, because they are all paid by the industry. In Germany we have a saying: a dog won't bite the hand that feeds it.


----------



## asdkj1740

Mucho said:


> All those YT heroes like GN, J2C, Bitwit, Paul's Hardware don't say a word, because they are all paid by the industry. In Germany we have a saying: a dog won't bite the hand that feeds it.


and sometimes techtubers play with the amd polaris tweaker / ddr4 oc software that was developed by the community.


----------



## GTANY

Same proverb in France.

AMD has an advantage over NVIDIA: on AMD cards, the power limit slider can be moved up to 50%, which removes any power throttle. What a pity that AMD is no longer a viable option in the high end. Without any pressure, NVIDIA and their partners won't change their shameful power limit policy, since the cards sell like hotcakes.


----------



## cstkl1

overpriced garbage

if the tuf had a mod bios, with a waterblock, it'd be the card of the year

the bios has a screwed-up curve. saw this on tuf oc vs tuf too. the non-oc version is better.

so now i need the strix non-oc bios.









couldn't run it in my main rig. 1 mm clearance. temps shot to 91c

also interesting: the card has a vrm temp readout in hwinfo.


----------



## Nizzen

cstkl1 said:


> View attachment 2461620
> 
> 
> overpriced garbage
> 
> if the tuf had a mod bios. with waterblock. it be the card of the year
> 
> the bios has a screwed up curve. saw this on tuf oc vs tuf also. the non oc version better.
> 
> so now i need the strix non oc bios .


Nice build anyway <3


----------



## GTANY

Can 3090 BIOSes be flashed on 3080 cards? After all, 3090 BIOSes have higher power limits.


----------



## asdkj1740

GTANY said:


> Can 3090 BIOSes be flashed on 3080 cards? After all, 3090 BIOSes have higher power limits.


no


----------



## cstkl1

Nizzen said:


> Nice build anyway <3


the tuf boosted higher. this card not so much. the v/f has issues. seriously overpriced garbage. not worth it atm out of the box

i don't like to mod the v/f for 24/7.. looking for the strix non-oc bios
@safedisk you are needed. help.
while at it bro, make it 150x3, 75 on the pcie..


----------



## cstkl1

just had dinner


tuf vs strix oc.
tuf default is 320w and the strix oc is 370w
oced it reads 375w but actually draws around 350w; strix default draws around 350w and maxes around 420w, but cannot sustain it

temps/noise tested both at 320w. tuf wins both

strix oc overclocking: higher sustained clock but lower boost

that's because the binned oc runs at a higher gpu voltage with a very flat curve. it's hitting vrel

so is this worth it? hell NO.

wc might let the card actually fly

also, the strix has a vrm temp readout in hwinfo.

so i'm pretty sure the v/f on the oced card is kinda screwed up and needs adjustment. the non-oc tuf was better


----------



## asdkj1740

strix series cards have had a vrm temp sensor for a long time,
e.g. the 1660 ti rog strix o6g, via the ITE IT8915FN.


----------



## cstkl1

asdkj1740 said:


> strix series cards has vrm temp sensor for long time.
> like 1660ti rog strix o6g, through ITE IT8915FN.


so how long is long? i don't remember seeing this.
2080ti strix.. can't remember. must have been a later version of hwinfo, because it wasn't there at launch afaik

1080, 1080ti, 1080ti poseidon???
don't remember it there either, but who used hwinfo back then?


----------



## acoustic

Back to 3080s..

Seems very hit or miss with mem OC. People either get a heavy mem OC or cap out around +500. Doesn't seem to be much in between.


----------



## cstkl1

acoustic said:


> Back to 3080s..
> 
> Seems very hit or miss with mem OC. Some people either get heavy OC mem or capping around 500. Doesn't seem to be much in between.


ok fair.

so far 4 asus units, tuf and strix, all did +1250..

gddr6x was actually designed for 21 gbps,

but some chips failed qc due to power delivery etc., hence the 19 gbps variant
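For scale, the 19 vs 21 Gbps bins translate directly into bandwidth over the 3080's 320-bit bus. Simple arithmetic that reproduces the 760 GB/s spec-sheet figure (the helper name is mine):

```python
def bandwidth_gb_s(gbps_per_pin: float, bus_width_bits: int) -> float:
    """Memory bandwidth in GB/s: per-pin data rate x bus width / 8 bits per byte."""
    return gbps_per_pin * bus_width_bits / 8

assert bandwidth_gb_s(19, 320) == 760.0  # stock RTX 3080, as in the spec sheet
assert bandwidth_gb_s(21, 320) == 840.0  # what the 21 Gbps design target would give
```

So a +1250 memory offset that holds is roughly chasing that lost 80 GB/s back toward the original target.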


----------



## cstkl1

the strix oc has a very linear voltage.
that's what caps the spikes for the higher tdp. it's running a constant 1.04-1.05v

i saw similar behaviour with the tuf oc bios.
it's not as good as the non-oc bios for boosting

so if anybody has the strix non-oc bios, please share.

cause atm i think the strix oc is overpriced garbage on air. the only redeeming quality is the 3x8-pin.


----------



## acoustic

Interesting. I just built a buddy's rig with a 3080 STRIX. Will see where his memory ends up when we OC his PC in a few days. My FTW3 Ultra is definitely not liking more than +500. The Micron lottery seems to be very hit or miss.

I will have to check voltages but my FTW3 Ultra @ +100 is not going under 2025 even at 70c with 2hr gaming. Right now I'm fighting with Precision X1, a terribly built piece of software. Same issue that existed with the 2080TI, you cannot control all 3 fans with Afterburner due to EVGA doing some proprietary thing with the fan control API. I see 2 fans spin up but middle fan stays on BIOS stock control. Until I get off the stock cooler, it's kind of annoying to use Afterburner.

I'm going to test with Afterburner today to see how only being able to control 2 of 3 fans affects temps, as I run my fans 1:1. Truthfully just waiting for waterblocks and then I'll flash the Strix OC bios to it. On air, the extra 50w seems wasted right now as thermal throttling is more of a problem than anything else.


----------



## cstkl1

acoustic said:


> Interesting. I just built a buddies rig with a 3080 STRIX. Will see where his memory ends up when we OC his PC in a few days. My FTW3 Ultra is definitely not liking more than +500. The Micron Lottery game seems to be very hit or miss.
> 
> I will have to check voltages but my FTW3 Ultra @ +100 is not going under 2025 even at 70c with 2hr gaming. Right now I'm fighting with Precision X1, a terribly built piece of software. Same issue that existed with the 2080TI, you cannot control all 3 fans with Afterburner due to EVGA doing some proprietary thing with the fan control API. I see 2 fans spin up but middle fan stays on BIOS stock control. Until I get off the stock cooler, it's kind of annoying to use Afterburner.
> 
> I'm going to test with Afterburner today to see how only being able to control 2 of 3 fans affects temps, as I run my fans 1:1. Truthfully just waiting for waterblocks and then I'll flash the Strix OC bios to it. On air, the extra 50w seems wasted right now as thermal throttling is more of a problem than anything else.


btw, this is unconfirmed, ya

here the tuf oc sells for usd 80-90 more than the tuf. the tuf was priced the same as reference

heard there's no more tuf, like ever, after this. 

so was this a ploy? as we know, all tufs can overclock very well. 

so if it's true, what a diabolical bait and reel.. master plan...


----------



## acoustic

It's so hard to get a card that you almost have to take whatever you can get. I had a choice between a STRIX, FTW3 ULTRA, and MSI TRIO. I've had EVGA for a long time and I prefer the customer support + warranty (waterblocks), so I grabbed that, but I must say the STRIX is very well built for Ampere. Putting my friend's rig together, I honestly really liked the STRIX. The higher power limit BIOS is nice as well, at least for those watercooling, as they won't need to flash the card now. 

I thought for Turing that EVGA had the superior card for sure. My 2080TI FTW3 was a complete animal on just a small, stupid 120mm hybrid cooler. 2175 under 45c and +1300mem. I ran it at 2100/8000 for 24/7 use and it was great.


----------



## cstkl1

acoustic said:


> It's so hard to get a card that you almost have to take whatever you can get. I had a choice between a STRIX, FTW3 ULTRA, and MSI TRIO. I've had EVGA for a long time and I prefer the customer support+warranty (waterblocks) so grabbed that, but I must say the STRIX is very well built for Ampere. Putting my friends rig together I honestly really liked the STRIX. The higher power limit BIOS is nice as well, at least for those watercooling as they won't need to flash a card now.
> 
> I thought for Turing that EVGA had the superior card for sure. My 2080TI FTW3 was a complete animal on just a small, stupid 120mm hybrid cooler. 2175 under 45c and +1300mem. I ran it at 2100/8000 for 24/7 use and it was great.


evga was worth it then because the kingpin came out later

only evga bins cards, so the rest get whatever isn't deemed kingpin-worthy. 

other aibs don't bin gpus


----------



## acoustic

I'll be doing some testing today with V/F curve. I don't really care too much for synthetic benching so never bothered much with V/F OCing, and prefer using a simple offset for gaming, but will play around with it just to see since I have some spare time today.


----------



## cstkl1

acoustic said:


> I'll be doing some testing today with V/F curve. I don't really care too much for synthetic benching so never bothered much with V/F OCing, and prefer using a simple offset for gaming, but will play around with it just to see since I have some spare time today.


i was talking about stability testing..
mine is 1-2 hrs of heaven followed by hours of gameplay

the oced cards have a very high, linear v/f, which is a problem compared to stock cards. they boost higher in low-tgp scenes.


----------



## Daepilin

This leniency with power and heat, since the Strix has the capacity for both, 'might' also lead to slightly worse chips on it, as they don't need to be as good to hit their targets in testing.

I've seen tons of TUFs handily beating the chip on my Strix, while a lot of Strix reviews I saw claimed very low OC headroom.

I need to give my card 0.868 V for a stable 1905 MHz (had it at 0.850, but that crashed in Control, though nowhere else), while I saw lots of TUFs reaching that well below 0.850. Of course it's always a lottery, but considering reviewers usually get better cards...

(The chip also doesn't max out that nicely, only hitting 2265-2250 MHz in ATITool low-load peaks at 1.087 V.)


----------



## Mucho

Here is a guy running the Palit OC bios on a Zotac Trinity NON OC.
Everything is working, even RGB


----------



## nycgtr

The worse of the 2 Strix. Took the better one apart; need to ghetto-rig a block onto it on the Z490 test bench. Need to keep it under 50c, apparently.


----------



## ThrashZone

nycgtr said:


> Worse outta the 2 strix. Took the better one apart, need to ghetto rig a block onto it on z490 test bench. Need to keep it under 50c apparently.
> 
> View attachment 2461671


Hi,
71c max yeah that's whack


----------



## criminal

3 more days and I will be back home so I can play with my 3080. Sigh... wasted weekend while my 3080 is sitting in the box at home.


----------



## shiokarai

Mucho said:


> Here is a guy running the Palit OC bios on a Zotac Trinity NON OC.
> Everything is working, even RGB
> View attachment 2461662


So the power draw is finally >320w?


----------



## Zeakie

Mucho said:


> Here is a guy running the Palit OC bios on a Zotac Trinity NON OC.
> Everything is working, even RGB
> View attachment 2461662


I have the Zotac OC and just flashed the Palit OC updated BIOS; will test and report back.


----------



## Zeakie

So before the flash my power slider was stuck at 100%; after the flash, 109%.
Before the flash, Port Royal stock: 11,213.
After the flash, stock: 11,309.
At the 109% limit, everything else stock: 11,463.
Presume it worked; will check what wattage it goes up to after my bong rip.


----------



## vigorito

One question about Newegg and the 3080: after any model of 3080 appears in stock, does that mean Newegg has it and shipping will take about a week, or are some of you waiting several weeks to get it?


----------



## Zeakie

A quick run of FurMark, 8x MSAA, 4K reported over 335 W running for a minute with YT in the background. So for anyone power-limited on a Zotac stuck at 320 W, the Palit BIOS at least helps a bit there.


----------



## criminal

vigorito said:


> One question about new egg and 3080,after any model of 3080 appear on stock,does that mean new egg have it and shipping its gonna be in a week or some of you guys are waiting several weeks to get it


You can't order it on Newegg if it isn't in stock. If you get an order and it doesn't get canceled, you will have it in a week.


----------



## vigorito

Ok thx


----------



## DirtyScrubz

For those of you with a Strix or FTW3, do you think either card would benefit from a WC block and be able to sustain 2100+ MHz in games (e.g. Warzone) at that point? I know the FTW3 has a lower PL but can be flashed which is what I'd do if I managed to get a hold of one.


----------



## Zeakie

Ranked myself 66th with my Zotac OC modded to the Palit OC BIOS.



https://www.3dmark.com/pr/391033


----------



## acoustic

Cooler the temps, the higher the clock speed. My FTW3 Ultra on stock BIOS holds 2025 @ 70c. When I have it set for benching with the AC on and fans 100%, I'm holding 2100+ pretty much the entire time. It's impressive, but with the AC off and playing games with a 1:1 ratio, temps eventually get up to the high 60s/low 70s.

I don't think you'll sustain 2100+ in games unless you're under 40c with a monstrous watercooling setup with low ambient. At the end of the day, the card is capable of pushing out 450w of heat; that's a lot no matter what way you look at it.

There's also the matter of the power limit being a factor, and how good a chip you have; it comes down to the highest clocks you can manage at the lowest voltage possible.

I don't think you can go wrong either way. The STRIX and FTW3 are definitely the two cards to grab if you are going under water, imo.


----------



## Daepilin

My Strix is close, dropping maybe to 2085 after some time in games when it gets too warm, but with an offset OC (+140/1250).

Haven't tried much with a curve at a high power limit, but at 1 V it seems to hold 2070 as well (leading to slightly above 400 W). I think it should be doable to bring it to/over 2100 at 1.012/1.018/1.025 V.

And as I wrote earlier, the chip is decent but not great (needs 0.868 V for a stable 1905 MHz in games like Control).

Waiting for the Watercool/Aqua Computer block; after I have it I will do more testing. Until then, undervolted below 300 W all the way.


----------



## spajdr

With Unigine Superposition (11804) on the EAGLE card (flashed with the Gaming OC BIOS), I got my highest GPU power so far: 363.1 W (+140 GPU / +775 VRAM).


----------



## acoustic

Daepilin said:


> My Strix is close, dropping maybe to 2085 after some time in games, when it gets to warm, but with offset oc (+140/1250)
> 
> Haven't tried much with a curve at high powerlimit, but at 1v it seems to hold 2070 as well (leading to slightly above 400w). I think it should be doable to bring it to/over 2100 with 1.012/1.018/1.025
> 
> And as I wrote earlier, the chip is decent but not great (needs 0.868 for stable 1905mhz in games like control)
> 
> Waiting for the watercool/aquacomputer block, after I have it I will do more testing, until then uv bellow 300w all the way


You make me really want to try some curve OCing. 2070/2085 is impressive after sustained loads. The memory clock is absurd too; looks like you won the Micron lottery! What temps are you seeing under sustained loads? As I mentioned earlier, with a 1:1 ratio I'm landing in the high 60s/low 70s, bringing me to 2025. I need to get on the PC today but have been busy with other things.. also getting off of Precision X1; if it crashes one more time or fails to actually apply the overclock again, I'm gonna chuck my monitor at a wall.

Update: Uninstalled PX1, re-installed Afterburner, and now messing with the curve. I locked the card at 1000 mV, stable at +75 core = 2025. It definitely appears I did not get a very good chip in comparison, or your STRIX is much better than you think. It drops to 1980 @ 72c running the Time Spy Extreme stress test. I have a 1:1 fan curve set; no clue what the 3rd fan is doing, as I can't see it in Afterburner or control it. I think the stock setting is close to 1:1 on the OC BIOS my card is set to.

Never messed with V/F OCing before; is there a way to adjust the curve faster? Having to drag the 9 million dots after 1000 mV is time-consuming LOL
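What all that dot-dragging accomplishes can be modeled in a few lines. A sketch of the usual "lock the curve at one voltage" trick (Afterburner exposes no scripting API for this, and the point values below are invented for illustration):

```python
# Clamp every V/F point at or above the lock voltage to the target clock, so the
# GPU never requests a higher (potentially unstable) state. Illustrative only.
def lock_curve(curve, lock_mv, target_mhz):
    return {mv: (target_mhz if mv >= lock_mv else mhz) for mv, mhz in curve.items()}

curve = {900: 1905, 950: 1965, 1000: 2025, 1050: 2085, 1100: 2130}  # mV -> MHz
locked = lock_curve(curve, lock_mv=1000, target_mhz=2025)

assert locked[1050] == 2025 and locked[1100] == 2025  # flattened above 1000 mV
assert locked[950] == 1965                            # points below the lock untouched
```

In Afterburner itself, Ctrl+F opens the curve editor and pressing L on a selected point locks the card to that voltage/frequency point, which is usually faster than dragging each dot.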


----------



## mingocr83

Hello everyone,

Hope you are well,

I'm hitting 71C with my ZOTAC Trinity non-OC on the stock BIOS. The case fans ramp to 100% after 50C on the CPU; I'm using 3 Noctuas at 2000 rpm, 1 exhaust, 2 intake. Since I purchased this card I've gradually seen temps creeping up. I'm also using a custom fan curve in FireStorm (the software is crap, BTW).

Anything special I should check?

Thanks!


----------



## disordinary

Hey all,

Aorus Xtreme at stock settings: the highest it's hit so far was 2085 MHz at 67 degrees; it's pretty happy sitting well north of 2000 MHz.

Max power draw has been 391W


----------



## nycgtr

DirtyScrubz said:


> For those of you with a Strix or FTW3, do you think either card would benefit from a WC block and be able to sustain 2100+ MHz in games (e.g. Warzone) at that point? I know the FTW3 has a lower PL but can be flashed which is what I'd do if I managed to get a hold of one.


Yes. I have 2 Strix here; both will maintain 2130 if under 55c, I would say. Around 60-70c I can maintain 2100 on both. I've yet to hit the power limit on either of them; temps seem to be more of the issue.


----------



## KingEngineRevUp

disordinary said:


> Hey all,
> 
> Aorus Xtreme at stock settings, the highest it's hit so far was 2085 MHz and 67 degrees, it's pretty happy sitting well north of 2000MHz.
> 
> Max power draw has been 391W


Can you please upload the bios, you will be the first to do so!


----------



## disordinary

KingEngineRevUp said:


> Can you please upload the bios, you will be the first to do so!


You're going to have to point me at some documentation on how to do that. Google gave me nada.


----------



## Celeras

Best I can do so far with my XC3 Ultra: https://www.3dmark.com/3dm/51574734

+215/1250 for 18825 GPU. That particular run topped out at 2160 MHz, but I've seen 2190 on a slightly lower score. Not too shabby considering the TDP issues with the BIOS. And I definitely lose a little in these benches from the CPU, but what're you gonna do. I think I'll be able to get 19,000 eventually.


----------



## KingEngineRevUp

disordinary said:


> You're going to have to point me at some documentation on how to do that. Google gave me nada.


On page 1 of this forum, check out the bios flashing section














Just do steps 1-8 and then step 20 to back up your BIOS. All it'll do is copy your BIOS out so you can share it here.


----------



## zhrooms

@disordinary has confirmed the "Gigabyte AORUS RTX 3080 Xtreme 10GB" power limit: 350 W maximum with the Silent BIOS, and 370 W with the Performance BIOS.

This is a retail sample, nothing weird going on; the BIOS confirms it's the triple 8-pin Xtreme variant. But this can't be what Gigabyte had in mind: the cheaper, dual 8-pin Master variant has a maximum power limit of 380 W, so 10 W more. Gigabyte has to release an update soon, because there's no way the cheaper dual-connector card is supposed to have a higher power limit.

Proof


----------



## disordinary

As above, I don't think this BIOS is worth sharing; although AORUS Engine says it's up to date, it seems pretty old and the power limits are low (lower than others have seen, anyway).


----------



## disordinary

@zhrooms Gigabyte recommends a 750 W power supply for the Master and 850 W for the Xtreme, so I think you're right: this isn't the BIOS they intended, and they'll probably fix it before this card goes wide.


----------



## acoustic

Yeah, seems like a slip up from Gigabyte. Can you flash the Strix OC bios to it?


----------



## disordinary

acoustic said:


> Yeah, seems like a slip up from Gigabyte. Can you flash the Strix OC bios to it?


Probably, I've created a support ticket with Gigabyte to see if there's an upcoming bios.


----------



## ssgwright

Doesn't matter, just flash the Strix BIOS; it's probably better than what Gigabyte will put out anyway.


----------



## hemon

To those with the 3080 Strix: please, can you tell me how much your card drops? 

I'm thinking of swapping my TUF for the Strix, so the question is how much (in FPS at 1440p) the higher power limit, and thus more stable clocks, are worth.


----------



## TK421

Any recommended BIOSes to use on an FE card?


----------



## Daepilin

acoustic said:


> You make me really want to try some curve OCing. 2070/2085 is impressive after sustained loads. Memory clock is absurd too, looks like you won the Micron Lottery! What temps are you seeing under sustained loads? As I mentioned earlier with a 1:1 ratio I'm landing in the high 60s/low 70s, bringing me to 2025. I need to get on the PC today but been busy with other things .. also getting off of Precision X1; if it crashes one more time or fails to actually apply the overclock again I'm gonna chuck my monitor at a wall


Temps definitely get to 70 when going above 400 W without the fans at 100%; that's why I'm waiting for water before pushing further.


----------



## asdkj1740

disordinary said:


> As above I don't think this bios is worth sharing, although Aorus engine says it's up to date it seems pretty old and the power limits are low (lower than seen anyway).


please run furmark 0xAA (off) at 1080p for 30 mins and take a screenshot containing fan rpm/voltage/watts as well.

this is outrageous. how much for that? the master on newegg is 849 but no xtreme.




disordinary said:


> @zhrooms gigabyte recommends a 750W power supply for the Master and 850W for the Xtreme so I think you're right, this isn't the bios they intended and they'll probably fix it before this card goes wide.


you mean you got a reply from gigabyte officials? why is it not intended?


at the time, the gigabyte 2080ti gaming oc came out with a super low power limit bios, like 260w or 280w, and gigabyte only released a higher 366w power limit bios later. but now we are talking about the 3080 aorus series, which launched much later than the first-wave cards like the tuf/gaming trio/gaming oc/eagle etc. i don't think these aorus cards are intended to ship with a 370/380w bios.
we have already seen the ftw3 and rog; they are custom designs, especially the rog one, and come with custom cooling solutions and 440/460w power limit bioses.

ps. in the gigabyte aorus livestream, the guy at the taiwan hq said they are aiming at 450w. so it really seems the cards are not intended to ship with a 370w bios. another qc problem, after the power cable adapter on the gaming/eagle series cards?




acoustic said:


> I'm not a die-hard fan. I couldn't care less about the guy truthfully, but he has put out some solid videos on PCB construction on various products. Calling someone a dumbass and then trying to justify that claim using something like RTL tuning is amusing. But yeah, sure buddy.. I "don't understand anything," and you are the smartest man alive.


sometimes pcb analysis does not equal actual performance, unless you are never going to press the power button.








ps. same vcore vrm on the prime a & proart creator: same vrm controller, same mosfets, same phase count, same controlling scheme (i guess).


----------



## Vapochilled

For gaming stability with the Eagle OC -> Gaming OC BIOS, I find the best setup is a custom curve:
0.887 - 1905
0.9 - 1920
0.916 - 1935
0.925 - 1950
0.93+ - 1980
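Written down as data, a curve like the one above can also be sanity-checked before applying it (a sketch; the values are copied from this post):

```python
# Vapochilled's Eagle OC -> Gaming OC custom curve: core voltage (V) -> clock (MHz)
curve = {0.887: 1905, 0.900: 1920, 0.916: 1935, 0.925: 1950, 0.930: 1980}

# A usable V/F curve must not decrease as voltage rises
volts = sorted(curve)
assert all(curve[lo] <= curve[hi] for lo, hi in zip(volts, volts[1:]))
```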


----------



## shiokarai

Alphacool Aurora Plexi GPX-N RTX 3090/3080 GPU Water Block Review - How to turn 340 watts on a GeForce RTX 3080 into a frosty zone | igor'sLAB


With the Alphacool GPU water block Aurora Plexi GPX-N RTX 3090/3080 I want to start the new round of GPU water blocks, but this time for Ampere and not Turing. A water cooling system makes sense with…




www.igorslab.de





Igor's review of the Alphacool reference block confirms it's a great block. I can confirm too, as I have one on my Zotac Trinity; it fits nicely and cools extremely well (temps 31-33 Celsius under full load, 20c ambient).


----------



## disordinary

@asdkj1740 it says on the spec sheets on the Gigabyte website that the Master requires 750 W and the Xtreme requires 850 W. I can't register the card yet either, so I think it was made available in my region too early; on Reddit someone said the launch date for the card will be announced on Friday.


----------



## KingEngineRevUp

hemon said:


> At those with the 3080 Strix: please, can you tell me how much drops your card?
> 
> It is because I'm thinking to change my TUF and for the Strix, so the question is how "much" (FPS at 1440p) the higher power and so stable clock is worth.


Assuming you have OC'd your TUF, it's like 1-2%. Let's say you were playing at 100 FPS; you would get 101 FPS now.


----------



## KingEngineRevUp

ssgwright said:


> don't matter, just flash to the strix, probably better then what gigabyte will put out anyway.


But what if the fan RPMs do not match up? Will that be an issue?


----------



## pazzoide76

ssgwright said:


> love my asus tuf here's my benches (non oc version )
> 
> Timespy: http://www.3dmark.com/spy/14487383
> 
> View attachment 2461605
> 
> 
> 
> Port: http://www.3dmark.com/pr/389591
> 
> View attachment 2461606



Hi,
I have an ASUS RTX 3080 TUF non-OC.
I wanted to ask whether the results you obtained were at stock or not.
What are your card's settings?

Thank you


----------



## VPII

Look, it makes me somewhat upset to see the average clock speeds some of these cards get in the benchmarks posted here. With this Palit I have, there is no way you'll get even close. I only once managed a 2003 MHz average clock in Port Royal, with my max clock bumped all the way up to 2190 MHz, and even then the card is so power-limited that it would never actually reach those clocks, as per the link below.


https://www.3dmark.com/pr/389317



Now this card, from what I have tested in Metro Exodus, Shadow of the Tomb Raider and Control at 1080p, can easily hold a constant 2130 to 2145 MHz core clock without a crash. I've been in contact with Palit; they got back to me with the excuse of 150 W per 8-pin plug, which can obviously go higher without issue, but 66 W for the PCIe slot, which we all know is actually 75 W. In my last email to them I showed what happens when I run my GPU at stock with and without the 9% power limit increase. Look at the max power draw in the pics.
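Palit's figures can be added up to see where the connector budget sits relative to the BIOS limit. A quick sketch (connector counts from the post above; the helper name is mine):

```python
# Board power budget: each 8-pin PCIe plug is rated for 150 W, plus the slot
# itself (Palit counts 66 W; the PCIe spec allows 75 W through the slot).
def board_budget_w(eight_pin_plugs, slot_w):
    return eight_pin_plugs * 150 + slot_w

assert board_budget_w(2, 66) == 366  # Palit's conservative accounting
assert board_budget_w(2, 75) == 375  # with the full 75 W slot allowance
```

Either way, the connector math leaves headroom above 320 W on paper, which is exactly the complaint here.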

















Oh and just to add the links, look at the difference in average clocks. 


https://www.3dmark.com/spy/14507709





https://www.3dmark.com/spy/14507938



I mean, seriously... 13 MHz???


----------



## Vapochilled

VPII said:


> Look it makes me somewhat upset when I see with some of these benchmarks shown the average clock speeds some of these cards get. Look with this Palit I have there is no way you'll get even close to it. I only once managed 2003mhz average clocks in Port Royal but my max clock was bumped all the way up to 2190mhz and I could only do it as the card is so limited for power that it would never reach those clocks as per the link below.
> 
> 
> https://www.3dmark.com/pr/389317
> 
> 
> 
> Now this card, from where I have tested it in Metro Exodus, Shadow of Tomb Raider and Control at 1080P, it can easily do 2130 to 2145mhz core clock constant without a crash. I've been in contact with Palit, they got back to me with the excuse of 150watt per 8pin plug which can obviously go higher without an issue, but 66watt for the PCIe which we all know is actually 75watt. In my past email to them I showed them what happens when I run my gpu stock with and without the 9% power limit increase. Look in the pick as max power draw.
> 
> View attachment 2461722
> View attachment 2461723
> 
> 
> Oh and just to add the links, look at the difference in average clocks.
> 
> 
> https://www.3dmark.com/spy/14507709
> 
> 
> 
> 
> 
> https://www.3dmark.com/spy/14507938
> 
> 
> 
> I mean seriously 13mhz....???



I think you need to double-check something there.
I have a 6700K (a 2015 CPU) and I can get 12200 points in Port Royal with a Gigabyte Eagle OC.
I get that with 20300 on the memory and undervolts, where the card runs 1980 MHz at 0.94 V and 1930 MHz at 0.91 V.

You should undervolt to increase power efficiency and reduce how often you hit the power limit.


----------



## VPII

Vapochilled said:


> I think you need to double check something there.
> I have a 6700k (2015 cpu) and i can get 12200 points in port Royal.
> I get that with 20300 in mem and undervolts where the card is at 1980 at 0.94 and 1930 at 0.91


Which card do you have? Sorry if I missed it. Secondly, I have played with the V/F curve like crazy without much luck: 2010 MHz needs 943 to 950 mV, 2025 MHz needs 956 mV, 2040 MHz needs 962 to 968 mV, 2055 MHz needs 975 to 981 mV and 2070 MHz needs 987 to 993 mV, and it is perfectly stable. I primarily test in games at 1080p with all the bells and whistles to see whether it stays above 2000 MHz without failing.
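For illustration, a V/F curve like that can be treated as a small lookup table. A hypothetical sketch using the midpoints of the voltage ranges above:

```python
# Hypothetical V/F curve from the ranges quoted above:
# clock (MHz) -> approximate voltage (mV) needed for stability.
VF_CURVE = {
    2010: 947,  # 943-950 mV
    2025: 956,
    2040: 965,  # 962-968 mV
    2055: 978,  # 975-981 mV
    2070: 990,  # 987-993 mV
}

def required_mv(target_mhz: int) -> int:
    """Lowest tested curve voltage that covers at least target_mhz."""
    for clock in sorted(VF_CURVE):
        if clock >= target_mhz:
            return VF_CURVE[clock]
    raise ValueError("target above the tested curve")

print(required_mv(2050))  # falls on the 2055 MHz point -> 978
```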


----------



## shiokarai

VPII said:


> Which card do you have? Sorry if I missed it. Secondly, I have played with the vcurve like crazy without much luck where 2010mhz is 943 to 950 mv, 2025mhz is 956vm, 204mhz is 962 to 968mv, 2055mhz is 975 to 981mv and 2070mhz is 987 to 993mv and it is perfectly stable. I primarily test in games with 1080P and all bells and whistles going to see how it stays above 2000mhz without failing.


Shunt mod.


----------



## VPII

shiokarai said:


> Shunt mod.


Sorry, but it's not something you try when you bought the card in South Africa: we basically pay 1000 to 1200 USD for the card you pay 700 to 800 USD for. Not that I couldn't do it, but considering I got the card shortly after release, it would be stupid to strip it and solder on it. Besides, they made sure with two stickers on the screws that they'll see if you opened it.


----------



## VPII

If I may ask, could any of you with a higher power limit save your InfoROM firmware to a file and share it with me? The nvflash switch is as follows:
--save <filename>.ifr


----------



## Vapochilled

VPII said:


> Which card do you have? Sorry if I missed it. Secondly, I have played with the vcurve like crazy without much luck where 2010mhz is 943 to 950 mv, 2025mhz is 956vm, 204mhz is 962 to 968mv, 2055mhz is 975 to 981mv and 2070mhz is 987 to 993mv and it is perfectly stable. I primarily test in games with 1080P and all bells and whistles going to see how it stays above 2000mhz without failing.



What about 4K benches?
I have an Eagle OC with the Gaming OC BIOS.
So maybe I could go with higher frequencies and lower voltages?
I think that discussion already happened: the InfoROM makes no difference to the TDP, correct?

BTW, still on the Aorus Master BIOS: I heard people saying that Performance was 350 W instead of the 380 W promised, and Quiet was 370 W???
Has anyone tested Aorus Engine? To see whether the 350 W limit instead of 380 W is due to the power slider not working? And maybe it would work with Aorus Engine?


----------



## VPII

Vapochilled said:


> What about 4k benchs ?
> I have Eagle OC with Gaming OC bios
> So maybe i could go with higher freqs and lower voltages?


You can try it, it may work. Hold CTRL and drag with the mouse on the left side of the curve, raising it in 15 MHz increments; then CTRL-drag on the right side and raise it by the same amount so it balances out. When you apply the V/F curve it will automatically adjust any clock point that doesn't fall into the correct slot.


----------



## VPII

Vapochilled said:


> What about 4k benchs ?
> I have Eagle OC with Gaming OC bios
> So maybe i could go with higher freqs and lower voltages?


Oh, and since your power limit is 370 W I think, could you please save your InfoROM as I described two posts earlier and share it with me, so I can see whether it works.


----------



## Chrisch

VPII said:


> If I may ask, can any of you with a higher power limit save your Inforom Firmware to a fine to share with me. The key for nvflash is as follow:
> --save <filename>.ifr


not that high, but 375W from ASUS TUF OC










TUF.ifr beim Filehorst - filehorst.de








----------



## VPII

Chrisch said:


> not that high, but 375W from ASUS TUF OC
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> TUF.ifr beim Filehorst - filehorst.de
> 
> 
> 
> 
> 
> 


Thanks a million


----------



## Vapochilled

VPII said:


> If I may ask, can any of you with a higher power limit save your Inforom Firmware to a fine to share with me. The key for nvflash is as follow:
> --save <filename>.ifr


I can do it later, but when you flash the Gaming OC BIOS, doesn't that flash the InfoROM too?
I assume it does, because when I run smi.exe -q -d power I see the 370 W info instead of the 340 W default, and as I said I see it hit 360 W during 4K benches.

The point now is, what about the Aorus 380 W?
I guess that's the highest 2x 8-pin TDP...
Has anyone tested this with the Aorus Engine software?
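That `-q -d power` check can be scripted rather than eyeballed. A rough sketch that parses the limit fields out of an `nvidia-smi` power query dump (the sample text here is illustrative, not a captured dump):

```python
import re

# Illustrative excerpt of `nvidia-smi -q -d POWER` output; the
# real dump has more fields, but the wattage lines look like this.
SAMPLE = """\
    Power Readings
        Power Draw                  : 352.10 W
        Power Limit                 : 370.00 W
        Default Power Limit         : 340.00 W
        Max Power Limit             : 370.00 W
"""

def power_field(text: str, name: str) -> float:
    """Extract a wattage field such as 'Power Limit' from the dump."""
    m = re.search(rf"{re.escape(name)}\s*:\s*([\d.]+) W", text)
    if m is None:
        raise KeyError(name)
    return float(m.group(1))

print(power_field(SAMPLE, "Default Power Limit"))  # 340.0
```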


----------



## martin28bln

Hello @VPII - I think we are battling in the 3DMark score rankings. Please let me know when you have finally flashed a better BIOS for 2x 8-pin reference cards. The problem with my Gainward 3080 is the power limit holding back higher scores. What is actually the best BIOS for this reference PCB with 2x 8-pin?

At the moment I am testing undervolting: currently staying at 1840 MHz with 0.818 V and getting stock performance at 30-50 W less.


----------



## Adrian76

Hi, does anyone have the TUF OC BIOS please? Apparently it works with the Zotac Trinity non-OC model. Is this correct, and does the HDMI port still work after flashing?

I don't have a backup card or iGPU on this 9700K, so I want to make sure there are no hitches in the flashing procedure. Thank you.

Edit: I have the BIOS now. Can anyone confirm the HDMI port still works on the Zotac 3080 Trinity with the Asus 3080 TUF BIOS? Thanks.


----------



## DrMorphine

Vapochilled said:


> I can do it later, but, when you flash the gaming OC bios, doesnt this flash the info also?
> I guess it does because when i do the smi.exe -q -d power i see the 370W info instead of the 340W defaults, and as i said i see the 360W hit during 4k benchs.
> 
> The point now is, what about Aorus 380W ?
> I guess thats the highest 2x pin TDP...
> Anyone tested this with Aorus Engine SW ?


It looks like the Aorus Master does not have a 380 W power limit, only 370 W (at least mine doesn't), and this is on the OC BIOS. Normally in benching it goes up to 345-350 W; only in Furmark does it reach 362-368 W. The OC is also not great: for benching I could run +100 core / +750 VRAM and it passes, but in games like AC Odyssey only +35 core / +500 VRAM was stable (or am I doing something wrong?). Even after overclocking it does not go over 350 W in games and 3DMark, only in Furmark.
Generally, if it goes over 2055 MHz in games it crashes to the desktop. I have not tried the Silent BIOS yet; maybe later. I think it is not a good card for OC: it is good at stock (with a decent entry-level OC), but there's not much headroom.
Any suggestions? The Time Spy run is with the bigger OC.


----------



## Chrisch

Adrian76 said:


> Hi does anyone have the TUF OC bios please, Apparently it works with the Zotac trinity non OC model, Is this correct and also does the HDMI port work also after flashing?
> 
> I don't have a backup card or IGPU on this 9700k so want to make sure there is no hitches in the flashing procedure, Thank you.
> 
> Edit: I have the bios now can anyone confirm the HDMI port still works on the Zotac 3080 Trinity with Asus 3080 TUF bios? Thanks.












AMPEREBIOS.rar beim Filehorst - filehorst.de











Edit: my 3dmark results 

8600K @ 4.5GHz
ASUS 3080 TUF OC @ 2130/10100 (avg. 2050-2090MHz) [air - 355W max PT)

http://www.3dmark.com/fs/23726245









http://www.3dmark.com/fs/23724481









http://www.3dmark.com/fs/23726083









http://www.3dmark.com/spy/14502022









http://www.3dmark.com/spy/14502129









http://www.3dmark.com/pr/391258


----------



## Adrian76

Chrisch said:


> 
> 
> 
> 
> 
> AMPEREBIOS.rar beim Filehorst - filehorst.de
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: my 3dmark results
> 
> 8600K @ 4.5GHz
> ASUS 3080 TUF OC @ 2130/10100 (avg. 2050-2090MHz) [air - 355W max PT)
> 
> http://www.3dmark.com/fs/23726245
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/23724481
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/23726083
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/spy/14502022
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/spy/14502129
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/pr/391258



That's great, thanks a bunch. Much appreciated.


----------



## Vapochilled

DrMorphine said:


> It looks like Aorus Master does not have 380 power limit, only 370 W ( atleast mine) - this is on OC BIOS. Normaly in benching it goes up to 345-350 W, only in furmark to 362- 368W. OC is also, not great, generally for bench i could give + 100 core + 750 RAM and it passes, but in games like AC Oddysey only + 35 core + 500 VRAM was stable ( or i am doing something wrong?), even after OC in games and 3d mark it does not go over 350, only in FURMARK.
> Generally if it goes over 2055 MHz in games it crashes to windows. I did not tried the silent bios. Maybe later. I think it is not a good card for OC, it is good on stock (not bad entry oc), but not much headroom.
> Any sugestions? The time spy is after bigger OC.
> View attachment 2461731
> View attachment 2461732


So is that a Gigabyte bug? Who stated they were 380 W?


----------



## asdkj1740

DrMorphine said:


> It looks like Aorus Master does not have 380 power limit, only 370 W ( atleast mine) - this is on OC BIOS. Normaly in benching it goes up to 345-350 W, only in furmark to 362- 368W. OC is also, not great, generally for bench i could give + 100 core + 750 RAM and it passes, but in games like AC Oddysey only + 35 core + 500 VRAM was stable ( or i am doing something wrong?), even after OC in games and 3d mark it does not go over 350, only in FURMARK.
> Generally if it goes over 2055 MHz in games it crashes to windows. I did not tried the silent bios. Maybe later. I think it is not a good card for OC, it is good on stock (not bad entry oc), but not much headroom.
> Any sugestions? The time spy is after bigger OC.
> View attachment 2461731
> View attachment 2461732


Let it run longer please, like 20 minutes.


----------



## Vaesauce

DrMorphine said:


> It looks like Aorus Master does not have 380 power limit, only 370 W ( atleast mine) - this is on OC BIOS. Normaly in benching it goes up to 345-350 W, only in furmark to 362- 368W. OC is also, not great, generally for bench i could give + 100 core + 750 RAM and it passes, but in games like AC Oddysey only + 35 core + 500 VRAM was stable ( or i am doing something wrong?), even after OC in games and 3d mark it does not go over 350, only in FURMARK.
> Generally if it goes over 2055 MHz in games it crashes to windows. I did not tried the silent bios. Maybe later. I think it is not a good card for OC, it is good on stock (not bad entry oc), but not much headroom.
> Any sugestions? The time spy is after bigger OC.


Sounds like you may have gotten the short end of the silicon lottery. My Aorus 3080 Master does perfectly fine with OCs in all benchmarks and games. I'm done with benchmarking, as I'm sure I've hit a wall (until an updated BIOS). My Aorus is stable at +1300 memory clock but can get into "corrective" (error-correction) territory there, so I stayed at +1200. On the core, the highest I could run was +130 stable. But it's definitely better to run a curve, as the card is much more stable at the top of the curve than at the bottom: you can practically do +180 at the top, but at the lower voltages +100 is probably more realistic.

That said, the Aorus is definitely 370 W.

P.S. I've since undervolted my Aorus, hitting stable Port Royal scores in the 12200s, and I'm going to keep it there. No point in throwing more voltage at it just for an extra 100 points in daily usage lol.


----------



## KingEngineRevUp

Vapochilled said:


> I think you need to double check something there.
> I have a 6700k (2015 cpu) and i can get 12200 points in port Royal with Gigabyte Eagle OC
> I get that with 20300 in mem and undervolts where the card is at 1980 at 0.94 and 1930 at 0.91
> 
> You should undervolt to increase power effiency and reduce the times you hit power limits


I'm not sure if your 6700K is overclocked enough to run better than a 3950X; it might be. I just want to mention that it's not an apples-to-apples comparison between a system running AMD and one running Intel. Intel machines usually run 3DMark a little better.


----------



## XxXSpitfireXxX

Chrisch said:


> 
> 
> 
> 
> 
> AMPEREBIOS.rar beim Filehorst - filehorst.de
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: my 3dmark results
> 
> 8600K @ 4.5GHz
> ASUS 3080 TUF OC @ 2130/10100 (avg. 2050-2090MHz) [air - 355W max PT)
> 
> http://www.3dmark.com/fs/23726245
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/23724481
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/23726083
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/spy/14502022
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/spy/14502129
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/pr/391258


That’s about 1000 points higher than my Gigabyte 3080 Gaming OC on Timespy, does this bios work on Gigabyte cards?


----------



## markuaw1

Talon2016 said:


> https://www.3dmark.com/3dm/51349539
> 
> 
> -- 19,680 GPU in Time Spy.
> 
> Asus Strix OC vBIOS just made the FTW3 3080 the card to get IMO. With the Asus vBIOS it is basically shunt modding itself and hugely under reporting power draw and the card holds crazy high boost now. I just managed to score the #1 spot in the US with this vBIOS on my FTW3 Ultra. Reported power draw under max overclock and max fans was around 330w. It's under reporting but the card performance is still scaling and the clocks are boosting to over 2100Mhz and holding near 2100Mhz the entire TimeSpy run. Not quite sure of how or why it's doing this yet, but its working and working well .
> 
> Fans run at same 3000rpm max so no issues there. One DP was deactivated, but haven't tried HDMI. Works for me as I use 2 DP ports for my monitors and they work.
> 
> Asus Strix OC 3080 vBIOS shared by a nice new owner over at reddit. They did us a solid!
> 
> *Asus Strix OC 3080 vBIOS*
> 
> 
> 
> 
> 
> 
> 
> 
> File on MEGA
> 
> 
> 
> 
> 
> 
> 


https://www.3dmark.com/pr/391691 WORKS GREAT THANK YOU


----------



## freejak13

markuaw1 said:


> https://www.3dmark.com/pr/391691 WORKS GREAT THANK YOU


Mind sharing your settings? Did you use precision or afterburner?


----------



## Vapochilled

One more doubt, people:


XxXSpitfireXxX said:


> That’s about 1000 points higher than my Gigabyte 3080 Gaming OC on Timespy, does this bios work on Gigabyte cards?


4K Time Spy? Below 4K it's only CPU-bound.


----------



## XxXSpitfireXxX

Vapochilled said:


> One more doubt, peop
> 
> 4k timespy? Bellow 4k it's.only cpu bound


No, not Time Spy Extreme, just Time Spy; I'm around 18300 with my card. I'm wondering if the Gigabyte Gaming OC can be flashed to either the TUF OC or the Aorus Master BIOS. As it is now I'm around 340-350 W average with core clocks around 1955-1995 MHz and peaks around 2100 MHz; I'm curious whether other BIOSes give more stable clocks in the higher ranges near 2000+.


----------



## Chrisch

XxXSpitfireXxX said:


> No not Timespy Extreme just Timespy, I'm around 18300 with my card, I'm wondering if the Gigabyte Gaming OC can be flashed to either the TUF OC or the Aorus Master. As it is now I'm around 340w-350w average and core clocks at around 1955-1995 with peaks around 2100, I'm curious to see if other bios give more stable clocks in the higher ranges near 2000+.


It's more that I have a low-voltage chip... my TUF runs 2100-2130 MHz at ~0.95 V, and with that it doesn't hit the power limit that often.


----------



## KingEngineRevUp

A question for those with an FTW3 who have flashed the Strix BIOS:

1. What happens to the RGB light bar?

2. Does it fix the fan issues with MSI Afterburner?

EVGA cards can't have their fans controlled well via AB. At 39% and lower fan speeds they spin up and then stop. From 40-90%, set fan speeds don't match the actual RPMs; only at 91%+ do they. Sometimes the third fan won't work at all; it just does its own thing.


----------



## delreylover

VPII said:


> Thanks a million


Have you tried it? Any different behavior?


----------



## BluePaint

19,922 in Time Spy with an MSI Trio with the Strix BIOS on air.
Using my 7700K @ 5.1 GHz instead of the 3900X @ 4650 (no SMT) gave me some more points despite lower average GPU clocks (worse cooling).


----------



## Talon2016

KingEngineRevUp said:


> Question to those with FTW3 that are flash the Strix BIOS.
> 
> 1. What happens to the RGB light bar?
> 
> 2. Does it fix the fan issues with MSI Afterburner?
> 
> EVGA cards can't have their fans controlled well via AB. At 39% and power fan speeds they spin and stop. From 40-90% fan speeds don't match RPMs. Only at 91%+ they do. Sometim the 3rd fan won't work at all, it just does its own thing.


1. The light bar continued to work for me; I was able to change LED colors in Precision X. Some reported it not working for them though, i.e. they could no longer change the color and the previous color persisted.

2. All fans still spin at 100% (3000rpm) and can be controlled in AB or Precision X.



BluePaint said:


> 19.922 Time Spy MSI Trio with Strix bios on air
> Using my 7700k @5.1Ghz instead of 3900X @ 4650 (no smt) gave me some more points despite lower avg GPU clocks (worse cooling)


Very nice score!


----------



## KingEngineRevUp

Talon2016 said:


> 1. The light bar continued to work for me, I was able to change LED colors in precision x, some reported it not working for them though. i.e they could no longer change color and previous color persisted.
> 
> 2. All fans still spin at 100% (3000rpm) and can be controlled in AB or Precision X.
> 
> 
> 
> Very nice score!


Is Afterburner better now than it was before? What I mean is this:

On the EVGA FTW3 stock BIOS, if you set 75% fan speed, the fans actually run at 60%. Odd behavior. But from 90-100% it seems to correct itself. It really just does its own thing.

So I'm wondering if the Strix OC BIOS will fix this issue, because I hate Precision X1.


----------



## shredy44

Hey guys, has anyone here BIOS-flashed the 3x 8-pin Colorful 3080 Advance OC?
I believe it's got a dual-BIOS feature!

I found a teardown video here.

Can someone tell me if it's got a 20-phase VRM configuration? Is it a reference-design PCB or custom? Towards the end of the video you can see the PCB clearly; I can see it has the 2-MLCC design.



https://evatech.com.au/product/6507/colorful-igame-rtx-3080-advance-oc-10g-eta-23102020


----------



## acoustic

Talon2016 said:


> https://www.3dmark.com/3dm/51349539
> 
> 
> -- 19,680 GPU in Time Spy.
> 
> Asus Strix OC vBIOS just made the FTW3 3080 the card to get IMO. With the Asus vBIOS it is basically shunt modding itself and hugely under reporting power draw and the card holds crazy high boost now. I just managed to score the #1 spot in the US with this vBIOS on my FTW3 Ultra. Reported power draw under max overclock and max fans was around 330w. It's under reporting but the card performance is still scaling and the clocks are boosting to over 2100Mhz and holding near 2100Mhz the entire TimeSpy run. Not quite sure of how or why it's doing this yet, but its working and working well .
> 
> Fans run at same 3000rpm max so no issues there. One DP was deactivated, but haven't tried HDMI. Works for me as I use 2 DP ports for my monitors and they work.
> 
> Asus Strix OC 3080 vBIOS shared by a nice new owner over at reddit. They did us a solid!


Holy sweet baby Jesus, this BIOS is NUTS. I'm not going to turn my AC on just to run benches right now, but playing Metro Exodus on the stock FTW3 BIOS I was pegging 400 W with transient 420-430 W spikes the entire time (3840x1600 / Extreme settings / RT max / DLSS off). With this BIOS it reports 320 W max. LOL. I must be pulling over 480 W, there's no way. The only issue is that temps are much higher: I went from a 74-75°C max to 82°C at one point. The clock speed, even at 82°C, was 2010-2025 MHz. Clocks are much steadier now, rather than the bouncy-castle rollercoaster they were before from slamming into the power limit.

Pretty insane stuff; this BIOS opens the legs on the FTW3.


----------



## VPII

delreylover said:


> Have you tried it? Any different behavior?


Hi there, no difference. It is pretty frustrating, because with the TUF OC, Strix, FTW3 and some others, when you look at their Time Spy and Port Royal runs, the average clocks sit way above 2000 MHz, since their raised power limit actually gets used. As I said earlier, I am in an email conversation with Palit; they returned with a poor excuse and did not even touch on the topic of the +9% power limit increase not really doing anything. Yes, it uses maybe 10 or 11 W more, but the result is only a fraction higher.


----------



## VPII

Out of interest I did a Port Royal run with my V/F curve, which takes max clocks up to 2130 MHz. I ran it once with the added 9% power limit and once without. Seriously, you're looking at a 0.6% difference... With the +9% the max power draw was 336.4 W, without it 327.6 W. It's sort of depressing. Make no mistake, the card is great and can clock pretty well when testing at 1080p to see what speeds it holds, but this is not what I expected from 30 W of extra power headroom.



https://www.3dmark.com/compare/pr/394510/pr/394505#
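Those numbers can be sanity-checked directly. A quick sketch of the relative change:

```python
def pct_change(new: float, old: float) -> float:
    """Relative change of `new` over `old`, in percent."""
    return (new - old) / old * 100

# Max power draw with and without the +9% power-limit slider:
extra_power = pct_change(336.4, 327.6)

print(round(extra_power, 1))  # 2.7
```

So the slider bought roughly 2.7% more power draw for the 0.6% score difference quoted above.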


----------



## asdkj1740

3080 / 3090 / 3070 Gigabyte Eagle Gaming OC & Vision Power Connector Concerns


***UPDATE 1.5 IMPORTANT INFO To clarify for everyone and any one new here the cards affected are as follows re serial number WK39 onwards will have the revised new connector block *UPDATE however some cards may be mixed and still could be on the old connector block even after WK39 WK38...




www.overclockers.co.uk




Gigabyte has further disclosed the exact batches affected by the "faulty" adaptor.
Users of the Gaming/Eagle models can check their serial number to see whether theirs is affected.

Please give this Gigabyte guy in the UK some love for helping the community.


----------



## martin28bln

My daily setting now: 1840MHz@0.818V - max. 270-280 W peak power at 4K / 3x WQHD --> performance with the memory OC is equal to stock, with 40-50 W less power and heat.


----------



## dev1ance

Got a Galax 3080 to use while I have an Xtreme and a TUF OC on order. This card runs surprisingly cool; I keep temps below 65°C with the fans at 2100 RPM (58%) or so, which is fine for me.
The max I can get is an 18.8k graphics score in Time Spy and about 12k in Port Royal (can't increase it further) using the modified Palit OC BIOS, hitting 355 W max, overclocked at +100 core / +550 mem.

Port Royal: 11.9-12.05k
Time Spy: 18.8k graphics score
Time Spy Extreme: 9,427 graphics score
Fire Strike Ultra: 11,537 graphics score
Superposition: 11,547


----------



## martin28bln

dev1ance said:


> Max I can get is 18.8k graphics score in TimeSpy and about 12k in Port Royal (can't increase it further) using the modified Palit OC bios, hitting 355w max and OCed with +100 core/+550 mem.


Which Palit OC Bios you used?


----------



## dev1ance

martin28bln said:


> Which Palit OC Bios you used?


The 3080PALITocUpdate(350).rom


----------



## markuaw1

freejak13 said:


> Mind sharing your settings? Did you use precision or afterburner?


Yeah, I used Precision: power, temp and voltage maxed out, +920 mem, +90 clock, room temp 20.5°C (69°F).


----------



## MrBridgeSix

The thing with the power connectors on Gigabyte cards is not that they are faulty, but that they are exactly like an old Molex connector: the pins are loose, so if you force them they can be pushed back. I've had no problems, but you should definitely be mindful of this when connecting and disconnecting the power cables, so you don't end up having to disassemble the card to fix it, or even RMA it.

The new connector block is probably sturdier and harder to push the pins back on, which would be an improvement mostly for Gigabyte itself, in the form of lower RMA risk.


----------



## asdkj1740

MrBridgeSix said:


> This thing with the power connectors on Gigabyte cards is not that they are faulty but rather the fact that they are exactly like an old Molex connector, the pins are loose so if you force them they can be punched back. I've had no problems but one should definitely be mindful of this when connecting and disconnecting the power cables, as so to not require you to disassemble the card to fix it or even to RMA it.
> 
> This new connector block is probably more sturdy and harder to punch the pins back, which could be an improvement for Gigabyte alone as in the lower RMA risk.


There are lots of Molex 4-pin connectors with this issue, and those are the bad-quality ones. There are also high-quality Molex connectors, like those bundled with Noctua fans.
However, this time Gigabyte chose to use this strange adaptor; it could have been avoided entirely if standard PCIe connectors had been used, or if serious QC had been done.


----------



## freejak13

asdkj1740 said:


> there are lots of molex 4pin connector having such issue, and these are bad quality ones.
> there are also some high quality molex connector like those bundled in noctua fans.
> however this time gigabyte intends to use such strange adaptor, it could have been avioded at all if standard pcie connectors were used, or serious qc were done.


Sent mine back to the retailer after the same thing happened. If Gigabyte cut corners here, who knows where else. Glad I waited a bit and got the FTW3 Ultra; much better build quality overall.


----------



## asdkj1740

freejak13 said:


> Sent mine back to the retailer after the same thing happened. If gigabyte cut corners here who know where else. Glad I waited a bit and got the ftw3 ultra. Much better build-quality overall.


Maybe the retailer simply gave you another new one from the store, but they all come from the same old batch.

Try asking for a new adaptor instead of a replacement card, unless the retailer is sure the replacement has the problem fixed; or simply return it and swap to another brand.


----------



## Adrian76

Does anyone have the Zotac Trinity 3080 non-OC BIOS with working RGB? I'm going to test whether reflashing with a known-working Trinity BIOS fixes the RGB issue. Cheers.


----------



## Vapochilled

freejak13 said:


> Sent mine back to the retailer after the same thing happened. If gigabyte cut corners here who know where else. Glad I waited a bit and got the ftw3 ultra. Much better build-quality overall.



Same problem here, but in my case I took some time to align them... and no problems until today :\ but I'm wondering if I should just send it back.
An RMA will take a long time...

BTW, with the Aorus Master BIOS, if you run nvidia-smi.exe -q -d power,
it reports only 370 W, not 380 W as stated on page 1.

Could someone with an Aorus confirm?
Because this is the highest-TDP BIOS, correct? (Well... I guess it's not anymore)... unless the dump was not taken properly.


----------



## MrBridgeSix

Vapochilled said:


> Same problem here, but in my case i took some time to align them... and no problems until today :\ but im wondering if i should just sent it back..
> RMA will take long time ...
> 
> BTW, AORUS Master BIOS, if you do nvidia-smi.exe -q -d power
> its only 370W and not 380W as stated on page 1.
> 
> Could someone with AORUS confirm?
> Because this is the highest TDP bios correct? (well... i guess its not anymore).... unless the dump was not taken properly


Are you guys really having issues with the connector? I was just careful when connecting the power cables; I made sure to wiggle them so as to connect without forcing any of the pins, and that was it. My serial number starts with 2035, what about yours?


----------



## Vapochilled

MrBridgeSix said:


> You guys really having issues with the conector? I was just careful when connecting the power cables, made sure to wiggle them as to connect without forcing any of the pins and that was it, my serial number starts with 2035 what about you?


Same here, 2035. The pins didn't go in. I had to open up the PCIe pins on my EVGA SuperNOVA a little and align them to fit... real crap. At the time I had no idea about this problem; I just felt stupid.


----------



## pazzoide76

Hello, I have an Asus RTX 3080 TUF non-OC. Do you think it makes sense to install the BIOS of the OC version?


----------



## Vapochilled

BTW, what about the Aorus Master BIOS? The one I downloaded is only 370 W, not 380. So... the same TDP as the Gaming OC.

Can any Aorus Master owner check?
Maybe this is related to the dual BIOS?


----------



## asdkj1740

Vapochilled said:


> Same here. 2035. Pins didn't got in. I have to open a little my evga supernova PCIe pins and align the pins to fit... Real crap .. at the time I had no idea about this problem... I was just feeling stupid.


Try filing an RMA and asking for a new adaptor.
You won't know whether the contact is good until it burns.


----------



## MrBridgeSix

asdkj1740 said:


> try to apply rma and ask for a new adaptor.
> you wont know whether the contact is good until it burns.


That's mostly fearmongering. I'd say only RMA if you pushed any of the pins back, or if it came from the factory with pins pushed back.


----------



## Chrisch

pazzoide76 said:


> Hello, i have an asus rtx 3080 tuf not oc. Do you think it makes sense to install the bios of the OC version?


Both can handle up to 375 W, so it doesn't make sense if you overclock manually anyway.


----------



## freejak13

pazzoide76 said:


> Hello, i have an asus rtx 3080 tuf not oc. Do you think it makes sense to install the bios of the OC version?


Same TDP, so not really, aside from a higher 'stock' boost clock.


----------



## freejak13

MrBridgeSix said:


> That's mostly fearmongering, I'd say only RMA if you punched any of the pins back or if came from the factory with pins pushed back.


Problem is, you wouldn't know if the pins were pushed in until you pull the connector out to examine it (à la Schrödinger's cat).


----------



## Vapochilled

asdkj1740 said:


> try to apply rma and ask for a new adaptor.
> you wont know whether the contact is good until it burns.





Vapochilled said:


> Btw, what about aorus master bios? The one I have downloaded is only 370w... Not 380. So... Same tdp like gaming oc
> 
> Can any aorus master owner check ?
> Maybe this is related with dual bios ?


I think I saw someone saying the silent BIOS was reaching 380W but not the performance one?


----------



## MrBridgeSix

freejak13 said:


> Problem is, you wouldn't know if the pins were pushed in until you pull the connector out to examine (ala Shrodinger's Cat).


You can pull the power connectors and try to push on the pins with a flathead; if they do not retract into the connector they are fine. But yeah, if you are that worried, go ahead and RMA. Apparently in the UK Gigabyte is making it easy; elsewhere I fear it could take a couple of months.


----------



## Celeras

Has anyone been able to hit their max scores via undervolting yet? I haven't been able to match them. Pumping everything up old school gives me a max clock of 2190MHz and an average of 1960MHz (with obviously big fluctuations) in Time Spy. When I undervolt to hold 1960MHz for basically the entire run, the scores are a few hundred points lower. Having difficulty finding the sweet spot!


----------



## MrBridgeSix

Celeras said:


> Has anyone been able to hit their max scores via undervolting yet? I haven't been able to match them up. Pumping everything up old school gives me a max clock of 2190mhz and an average of 1960mhz (and obviously big fluctuations) in Timespy. When I undervolt to just hold 1960mhz for basically the entire run, the scores are a few hundred points under. Having difficulty finding the sweet spot!


Find the highest frequency at each voltage level in 25mV steps; that way you will get the best scores in benchmarks while power limited.

Be mindful that I found Time Spy and Port Royal to not be representative of gaming stability or even gaming clocks.
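The sweep described above can be sketched in a few lines. This is just an illustration of the search logic, not a real tool: `is_stable` is a hypothetical stand-in for whatever stability test you actually run (a Time Spy loop, a game session) at each voltage/clock pair.

```python
# Sketch of the voltage/frequency sweep: for each voltage step (25 mV apart),
# walk the clock up until the (hypothetical) stability check fails, and keep
# the last stable point. `is_stable` is NOT a real API.

def sweep_curve(is_stable, v_start=750, v_end=1050, v_step=25,
                clk_start=1800, clk_step=15):
    """Return {voltage_mV: highest_stable_clock_MHz}."""
    curve = {}
    for mv in range(v_start, v_end + 1, v_step):
        clk = clk_start
        best = None
        while is_stable(mv, clk):
            best = clk
            clk += clk_step
        if best is not None:
            curve[mv] = best
    return curve

# Toy stand-in: pretend each extra 25 mV buys ~30 MHz of headroom.
toy = lambda mv, clk: clk <= 1800 + (mv - 750) // 25 * 30
print(sweep_curve(toy))
```

The resulting table is exactly the flattened V/F curve you would then enter point by point in Afterburner's curve editor.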


----------



## asdkj1740

MrBridgeSix said:


> That's mostly fearmongering, I'd say only RMA if you punched any of the pins back or if it came from the factory with pins pushed back.


Not really; there is a picture in that UK forum showing the cable burnt.
Bad contact increases the temps of the pin, cable, and header.

Replacing the adapter is not difficult; of course, an RMA is better in terms of risk.


----------



## Celeras

MrBridgeSix said:


> Find the highest frequency at each voltage level in 25mv steps, that way you are going to get the best scores in benchmarks while power limited.
> 
> Be mindful that I found Time Spy and Port Royal to not be representative of gaming stability or even gaming clocks.


Of course. What I mean is that I'm using settings so the average frequency is equal, but the score isn't quite matching up.


----------



## mouacyk

MrBridgeSix said:


> Find the highest frequency at each voltage level in 25mv steps, that way you are going to get the best scores in benchmarks while power limited.
> 
> Be mindful that I found Time Spy and Port Royal to not be representative of gaming stability or even gaming clocks.


Ugh, that should be fun. Aren't the new boost behavior and power handicap lovely? And all this work will be for nought when an XOC BIOS gets released.


----------



## MrBridgeSix

Celeras said:


> Of course. What I mean is that I'm using the settings so the average frequency is equal but the score isn't quite matching up.





mouacyk said:


> Ugh, that should be fun. Aren't the new boost and power handicap lovely? Then all this work would be for nought, when an XOC bios gets released.


Yeah, it ain't fun at all, but my highest scores are when doing that. As I said, it's optimal for scores when power limited; when power is out of the equation there's no reason to do it.


----------



## spajdr

Seems that Quake II RTX is the best so far for finding out if your voltage/clock is stable.


----------



## Celeras

MrBridgeSix said:


> Yeah, it ain't fun at all, but my highest scores are when doing that, as I said it is optimal for scores when power limited, when power is out of the equation there's no reason to do it.


I definitely should be able to; not sure what the deal is. Here's a good example:

18825 GPU score w/ average frequency 1940MHz (max 2190MHz): https://www.3dmark.com/3dm/51574734
18625 GPU score w/ average frequency 1952MHz (max 1960MHz): https://www.3dmark.com/3dm/51634176

The first is everything maxed old school, the second is undervolting to 0.856V. Just can't figure out why the score is lower despite the higher average frequency. It is an actual average and not just the median or something, right?
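One thing worth checking about the averages themselves: if the monitoring tool samples the clock at uneven intervals, a plain sample mean and a time-weighted mean can disagree noticeably. A quick illustration with made-up numbers (not taken from either linked run):

```python
# Two ways to "average" a clock trace sampled at uneven intervals.

def sample_mean(samples):
    """Plain mean of the sampled clocks, ignoring how long each was held."""
    return sum(clk for _, clk in samples) / len(samples)

def time_weighted_mean(samples):
    """Weight each clock by how long it was held until the next sample."""
    total = 0.0
    for (t0, clk), (t1, _) in zip(samples, samples[1:]):
        total += clk * (t1 - t0)
    return total / (samples[-1][0] - samples[0][0])

samples = [(0, 2190), (1, 2190), (2, 1905), (10, 1905)]  # (seconds, MHz)
print(sample_mean(samples), time_weighted_mean(samples))  # 2047.5 vs 1962.0
```

So two runs reporting the same "average" clock can have spent very different amounts of time at the high end, which would show up in the score.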


----------



## Vapochilled

Anyone playing COD Warzone here?
The game has been crashing since I got the card. I don't have problems in other titles when using a fixed 1920 curve....

I think there is a memory issue...
Memory is at 9571 all the time during the game...


----------



## KingEngineRevUp

Vapochilled said:


> Anyone playing cod warzone here?
> The game is crashing since I got the card. I don't have problems in other titles when using a fixed 1920 curve....
> 
> I think there is a memory issue...
> Memory is at 9571 all the time during the game...


It's a good game to test your OCs, I'll tell you that. With the Strix BIOS I can play Warzone with +115MHz. In multiplayer, I can only do +105MHz.

I have +1200 on the memory right now.


----------



## Zeakie

Running extra tests with my Zotac-to-Palit mod, it draws around 355W max running Port Royal, hitting that every now and then.. so the limit on the card is straight firmware related, lazy cunts.


----------



## Riadon

My Gaming OC with the stock BIOS is starting to occasionally pull 380W+ in RTX games even though the power limit is 370W; strange, because when I got the card it barely went over 355W.










The picture was taken after a few hours of Metro Exodus; the only change I've made is swapping my monitor to an LG CX.


----------



## acoustic

Riadon said:


> My Gaming OC with stock bios is starting to occasionally pull 380w+ in RTX games even though the power limit is 370w, strange because when I got the card it barely went over 355w.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Picture was after a few hours of Metro Exodus, only change I've made is swapping my monitor to an LG CX


Metro seems to have some intense transient wattage spikes. On the stock FTW3 BIOS, I was seeing spikes to 425-430W.


----------



## Mucho

So I tried many different BIOSes on my Palit OC. The Galax SG and Emtek Black Edition seem to have a reference-PCB BIOS and work on the Palit, but only the Emtek has a 350W PL and a slightly higher boost than the Palit.
With the Aorus Master BIOS I get a bit more PL and it's more stable holding the highest clock, but the problem is only 1x DP and the HDMI are working, so I switched back to the Palit BIOS.


----------



## daveleebond

Zeakie said:


> Running extra tests with my zotac to palit mod it draws around 355w max running port royal hitting that every now and then.. so the limit on the card is just straight firmware related lazy cunts.


I thought there were fan issues using that BIOS and the TUF OC one was better (for the Zotac), although you lose the middle DisplayPort? Are you experiencing any problems?


----------



## ppkstat

KingEngineRevUp said:


> EVGA cards can't have their fans controlled well via AB. At 39% and lower fan speeds they spin and stop. From 40-90%, fan speeds don't match RPMs; only at 91%+ do they. Sometimes the 3rd fan won't work at all, it just does its own thing.


Does the same apply to the XC3s?


----------



## Celeras

I have an XC3 but I can't check. HWiNFO, GPU-Z, and Afterburner all only show 2 fan speeds. X1 is the only one that even shows the third fan for me.


----------



## djriful

Talon2016 said:


> https://www.3dmark.com/3dm/51349539
> 
> 
> -- 19,680 GPU in Time Spy.
> 
> Asus Strix OC vBIOS just made the FTW3 3080 the card to get IMO. With the Asus vBIOS it is basically shunt modding itself and hugely under-reporting power draw, and the card holds crazy high boost now. I just managed to take the #1 spot in the US with this vBIOS on my FTW3 Ultra. Reported power draw under max overclock and max fans was around 330W. It's under-reporting, but the card's performance is still scaling and the clocks are boosting to over 2100MHz and holding near 2100MHz the entire Time Spy run. Not quite sure how or why it's doing this yet, but it's working and working well.
> 
> Fans run at same 3000rpm max so no issues there. One DP was deactivated, but haven't tried HDMI. Works for me as I use 2 DP ports for my monitors and they work.
> 
> Asus Strix OC 3080 vBIOS shared by a nice new owner over at reddit. They did us a solid!
> 
> *Asus Strix OC 3080 vBIOS*
> 
> File on MEGA (mega.nz)


https://www.3dmark.com/spy/14525124  *3080 FE stock voltage + 10600k*
Graphics Score: *19 768 *


----------



## ssgwright

djriful said:


> https://www.3dmark.com/spy/14525124  *3080 FE stock voltage + 10600k*
> Graphics Score: *19 768 *


Nice score!


----------



## Reinhardovich773

djriful said:


> https://www.3dmark.com/spy/14525124  *3080 FE stock voltage + 10600k*
> Graphics Score: *19 768 *


This is so far the highest score I've seen on a 3080 in this thread. Are you by any chance using a water block to cool your card? 51 degrees is pretty cool for an FE card. It's either that or you're running fans at 100% in a very cool room haha. Still, impressive score!


----------



## Celeras

It's gotta be under water.. 2077MHz average frequency? Super high! My max is similar to his but my average is over 150MHz less.


----------



## ssgwright

No, I believe there was an EVGA (3x 8-pin card) flashed with the ASUS Strix BIOS that beat it.


----------



## djriful

I'm running the fan at 100%. Ambient temp 22C.


----------



## ssgwright

How are those temps possible on air? That's in line with or better than water.


----------



## acoustic

https://www.3dmark.com/3dm/51643515?



3080 FTW3 w/ STRIX OC BIOS. Stock voltage / 9900K @ 5GHz.
Graphics Score: 19406.

Destroyed the Port Royal scores I had with the stock FTW3 BIOS, which were set with all case fans at 100%, the A/C on, and super low ambient. My last PB was 12593 with voltage at 100% and absolutely max clocks; I think my max temp was 48C. Crushed it with lower clocks and stock voltage.



https://www.3dmark.com/3dm/51643241?


Port Royal: 12639

There are a lot more points to be had here. I don't have the AC on, the ambient in the basement is slightly warm from gaming earlier, and the card was already warmed up.


----------



## djriful

Open bench, fan at 100%, and I have another fan blowing from the top. Several people on the Nvidia Discord said I have a binned 3080 FE. I never claimed that, but they said it is. lol


----------



## Celeras

djriful said:


> Open bench fan 100%, I have another fan blowing from the top. Several people on Nvidia discord said I have a binned 3080 FE. Never claim that but they said it is. lol


I see. So you're a no good dirty cheater


----------



## MrBridgeSix

You guys getting blue screens when pushing overclocks? I don't remember that being a thing with Pascal but Ampere has given me a couple of BSODs.


----------



## ssgwright

No, no blue screens; even when my overclock is unstable my game will just crash, although I have hard locked a time or two.


----------



## MrBridgeSix

ssgwright said:


> no no blue screens, even when my overclock is unstable my game will just crash, although I have hard locked a time or two


Yeah, hard locking is just like a BSOD, right? Maybe even worse, as there are no logs of it.

I think it might be driver maturity? I'm pretty sure my Pascal systems never behaved like that, but back then I joined some 8 months after the Pascal release.


----------



## martinhal

I will be part of the group once my card arrives. Palits are quite cheap here. If I get a Palit and add a water block, it comes close to a Strix or Gigabyte Extreme in terms of money. I know I could also put one of the other cards under water, but that would bring the cost closer to a Palit 3090. Aside from benchmarks, would a Palit under water be the same as, say, a Strix in terms of gaming performance?


----------



## Celeras

I'm not quite sure what the problem is supposed to be with Afterburner + EVGA fan #3; however, I popped open the case and can confirm that all 3 spin up on my XC3 Ultra when I go to 100% in AB.

I can't tell the RPMs, because Precision is the only thing I can find that actually reads the third fan, and as soon as I open it, it overwrites what AB does.


----------



## VPII

martinhal said:


> I will be part of the group once my card arrives. Palits are quiet cheap here. If I get a Palit and add a water block it comes close to a Strix or Gigabyte Extreme in terms of money. I know I could also put one of the other cards under water but that would bring the cost closer to a Palit 3090. Aside from benchmarks , would a Palit under water be the same as say a Strix in terms of gaming performance ?


Hi Martin, did you order from Woot? As for performance, you'll struggle to reach Strix or FTW3 Ultra performance, as your power limit holds you back. Flashing a better BIOS would not really help; from what I have seen with my Palit, it is hard limited at 320W, and the added 9% power limit does not really give you that much more performance. Still, the card performs great, no problem there.


----------



## martinhal

Yes, it is from Woot. Is the card loud under a gaming load? I have seen the power limit is a bit low, but for gaming use I would assume it would be perfect for the price. Looking at some of the reviews, the FPS difference is a few frames at most vs top-end cards... 93 vs 90, not really a big deal.


----------



## favabean

Adrian76 said:


> Does anyone have the Zotac Trinity 3080 non-OC BIOS with working RGB? Going to test to see if it fixes the RGB issue by reflashing with a known working Trinity BIOS, cheers.





http://www.filedropper.com/3080-trinity


RGB is working on my Trinity non-OC. Hope it works out for you!


----------



## Reinhardovich773

VPII said:


> Hi Martin, did you order from Woot? As for the performance, you'll struggle to reach Strix or FTW3 Ultra performance as your power limit is holding you back. Flashing it with a better bios would not really help as from what I have seen with my Palit, it is hard limited at 320watt and the added 9% power limit do not really give you that much more in performance. Still, the card performs great no problem there.


Hey man. I saw your top Time Spy graphics score on your Palit GamingPro OC 3080; I believe it was around 19.1K. I checked a review of the 3080 Strix OC and the stock graphics score was 18.2K, so a heavily OC'ed Palit 3080 OC will beat a Strix 3080 at stock, provided of course you're in a cool room, have a case with great airflow, and run the fans constantly at 100%.


----------



## Reinhardovich773

djriful said:


> Open bench fan 100%, I have another fan blowing from the top. Several people on Nvidia discord said I have a binned 3080 FE. Never claim that but they said it is. lol


Yeah, you definitely got a golden sample, and I'm sure most people here are super envious of you haha. BTW your 19.8K Time Spy graphics score beat a 3090 FE's stock score of 19.2K. It's absolutely amazing that you were able to achieve that with a card as power constrained as the FE, which again proves you got one of the best 3080 chips in the world. So congrats buddy!


----------



## gerardfraser

*** I may as well join in on the Time Spy run stuff. Of course there's not a chance of running a PC game at 4K with these clocks; maybe 1080p, where the card is not stressed.
Asus TUF 3080
Avg clock 2004MHz
Max 2175MHz



https://www.3dmark.com/3dm/51649136?


3600XT


----------



## VPII

Reinhardovich773 said:


> Hey man. I saw your top Time Spy graphics score on your Palit GamingPro OC 3080. I believe it was around 19.1K. I checked a review of the 3080 Strix OC and the stock graphics score was 18.2K, so a heavily OC'ed Palit 3080 OC card will beat a Strix 3080 at stock, provided of course you're in a cool room, have a case with great airflow and are running fans constantly at 100%.


Hi there, I'm not sure. When I bench the card the fans run at 100%; when I game it's like 50 to 60%, so you hardly hear it, and temps remain around 60C or so.


----------



## Adrian76

favabean said:


> http://www.filedropper.com/3080-trinity
> 
> 
> RGB working on my trinity non oc. Hope it works out for you!


Thanks very much, I'll give it a shot.


----------



## KingEngineRevUp

ppkstat said:


> Does the same applies to the xc3s?


Yes. You have to open Precision X1 if you don't see the third fan spinning, hit the lock button to force all three fans to sync together, then close Precision X1. It's a real pain in the ass. I think I see Precision X1 open itself at Windows startup, and it unsyncs the fans each time.

Also, AB doesn't report the fan speeds correctly, so from 40% to 90% you have to make a very aggressive fan curve because it'll always be behind.

For example, 50% should be 1500 RPM but it'll be like 1100 RPM or something.

I would use Precision X1 if someone could recommend me an OSD replacement for Afterburner and RivaTuner.
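The "aggressive curve" workaround above amounts to a simple correction: if the card lands at ~1100 RPM when 50% should give ~1500, scale each point of your curve up by the observed undershoot. The ratio below is taken straight from the example in the post, not measured, so treat this as a sketch of the idea only:

```python
# Bump each fan-curve point so the *actual* RPM lands where you wanted it.
# Only the 40-90% band is affected, matching the misreporting range described.

def compensate(curve, undershoot=1100 / 1500, lo=40, hi=90):
    """curve: {temp_C: desired_percent}. Returns the percent to request."""
    out = {}
    for temp, pct in curve.items():
        if lo <= pct <= hi:
            pct = min(hi, round(pct / undershoot))  # ask higher to land on target
        out[temp] = pct
    return out

print(compensate({40: 30, 60: 50, 70: 66, 80: 95}))
```

So a 50% point becomes a 68% request, which should spin the fans at roughly the RPM a true 50% would have given.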


----------



## Adrian76

Adrian76 said:


> Thanks very much i'll give it a shot.


Unfortunately it did not work; RGB is still broken. Hopefully Zotac pull their finger out of their ass and get it fixed this week.


----------



## Vapochilled

MrBridgeSix said:


> Yeah, hard locking is just like a BSOD right? Maybe even worse as there are no logs of it.
> 
> I think it might be driver maturity? I'm pretty sure my Pascal systems never behaved like that, but back then I joined some 8 months after Pascal release.


I can't get Time Spy Extreme to pass with my highest undervolt OC.
It passes if I decrease the curve, meaning a less aggressive undervolt.

But in COD Warzone.... I get error 6068 after 3 rounds... never in the 1st round....
When I look at memory usage, I see a flat line at 9571MB.

It just seems weird to me. I'm planning to decrease from 5K to 2K resolution, play, and see if I have the same problem.
Memory usage should decrease, and therefore the DirectX 6068 error should go away.
If confirmed, then we have another memory driver issue....

I still don't have gaming stability..
3DMark results are less important here, unless that's your only gaming, guys hahahha


----------



## Vapochilled

BTW, could anyone upload both Aorus Master BIOSes, Perf and Silent? As I said, the Perf BIOS is 370W and NOT 380.


----------



## lowrider_05

gerardfraser said:


> *** I may as well join in on the timespy run stuff. Of course not a chance running a PC game @4K with these clocks,maybe 1080p where the card is not stressed.
> Asus tuf 3080
> Avg clock 2004Mhz
> Max 2175Mhz
> 
> 
> 
> https://www.3dmark.com/3dm/51649136?
> 
> 
> 3600XT


Nice job on the CPU side of things, but my TUF OC seems a little stronger with a 19464 GPU score.
https://www.3dmark.com/spy/14450715

what are the OC settings of your 3600XT?


----------



## Zeakie

daveleebond said:


> Thought there were fan issues using that BIOS and the TUF OC was better (for the Zotac) although you lose the middle DisplayPort? You experiencing any problems?


I've tested both and get higher frames and scores with the Palit BIOS; give or take, it's a 1 to 3 FPS difference.. all fans run perfectly in Afterburner. I use HDMI so I can't confirm the DisplayPort issue. 5 to 10 FPS improvements over the stock Zotac BIOS.


----------



## MangixZ

Vapochilled said:


> BTW, anyone could upload both AORUS Master bios? Perf and silence? As i said, Perf bios is 370 and NOT 380.


So is mine. Both versions of the BIOS are 370W, and in actual use it's about 340W.


----------



## Adrian76

I've just tried the Palit 3080 OC BIOS on the Zotac Trinity 3080, and it doesn't work: only 320 watts is being pulled even though you can move the slider to +9%, so it does nothing toward the Palit's 350W. Heaven results are exactly the same, obviously, because it's not using more than around 320W.


----------



## Zeakie

Adrian76 said:


> I've just tried the Palit 3080 OC bios on the Zotac trinity 3080 it doesn't work there is still only 320 watts being pulled even though you can move the slider to +9% Now it's not doing anything for the power of 350W of the Palit also Heaven results exactly the same obviously because it's not using more than around 320w.


Afterburner and the Nvidia performance tuner both show over 350W for me. I'm running a Zotac OC and you're on a regular one.. could they have hardware limited the non-OC version?


----------



## VPII

Adrian76 said:


> I've just tried the Palit 3080 OC bios on the Zotac trinity 3080 it doesn't work there is still only 320 watts being pulled even though you can move the slider to +9% Now it's not doing anything for the power of 350W of the Palit also Heaven results exactly the same obviously because it's not using more than around 320w.


Look, this is exactly the same for me with my Palit GamingPro OC. Yes, it does use around 10 to 12 watts more when I increase it by the 9%, but performance is only a little better.

I actually stripped my card; I was able to remove the stickers covering two of the screws and re-stick them after putting the card back together again. I had to, as my middle fan would not run for some reason.


----------



## Adrian76

Zeakie said:


> Afterburner and nvidia performance tuner both show over 350w for me. I'm running a zotac oc you're on a regular one.. could they have hardware limited the non oc version?


Yeah, I'm on a non-OC Trinity; Afterburner reports the same 320W as the Zotac BIOS. Maybe the non-OC is hardware limited somewhere.


----------



## Adrian76

VPII said:


> Look this is exactly the same for me with my Palit Gamingpro OC. Yes, it does use around 10 to 12 watts more when I increase it by the 9% but performance is only a little more.
> 
> I actually stripped my card, was able to remove the stickers covering two of the screws and restick them after putting the card back together again. I had to as my middle fan would not run for some reason.


Yeah, that's the BIOS I tried, the Palit OC; no difference in performance whatsoever. You can push the slider to 109% but it doesn't do anything at all.


----------



## Zeakie

Adrian76 said:


> Yeah that's the bios I tried the Palit OC no difference in performance whatsoever, you can push the slider to 109% but it doesn't do anything at all.


For me, leaving the slider at 100 it limits itself to 320, and when I do 109 it goes up to 355; I haven't seen it draw more. Depending on the game/test it's a noticeable bump in performance; it could be the difference between a 55 FPS experience and a smooth 60.


----------



## VPII

Zeakie said:


> For me leaving slider at100 it limits itself to 320 and when I do 109 it goes up to 355 havent seen it draw more.. and depending on game /test its a noticable bump in performance could be difference between a 55fps experience and a smooth 60.


What card is it that's flashed with the Palit BIOS? The Palit card itself appears to be hardware limited to 320W: when you increase 9% for 350W you'll see a max of 336W in Time Spy and maybe 340W in Port Royal; however, if you run the Octane bench your card will pull 348W no problem, and clocks will remain at max for almost the entire benchmark.
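For reference, the wattage figures being traded here follow directly from the slider percentage; a quick sanity check (assuming the 320W base limit discussed in this thread):

```python
# Effective power limit = base limit scaled by the Afterburner slider.
# Whether the card actually reaches it depends on the workload, as noted above.

def power_limit(base_w, slider_pct):
    """Effective power limit in watts for a given slider percentage."""
    return base_w * slider_pct / 100

for pct in (100, 109):
    print(pct, power_limit(320, pct))
```

So +9% on a 320W card caps out around 349W, which is why Time Spy (which doesn't fully load the card the whole run) shows ~336W while a sustained compute load like Octane gets closer to the cap.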


----------



## Adrian76

Zeakie said:


> For me leaving slider at100 it limits itself to 320 and when I do 109 it goes up to 355 havent seen it draw more.. and depending on game /test its a noticable bump in performance could be difference between a 55fps experience and a smooth 60.


Yeah, I tried both 100 then 109; power draw was exactly the same. I wonder why that is. They must have hard limited the non-OC variants, from what I've tested anyway.


----------



## Zeakie

VPII said:


> WHat card is it flashed with the Palit Bios.... as the Palit card itself appear to be hardware limited to 320watt, when you increase 9% for 350 watt you'll see max 336 in Time Spy and maybe 340watt in Port Royal, however if you were to run the Octane bench your card would pull 348watt no problem and clocks would remain max almost the entire benchmark.


Mine is the Zotac Trinity OC flashed to the updated Palit OC BIOS.


----------



## Dreams-Visions

Whew, this Strix OC is disgusting. 2115MHz locked pretty much at all times. +150/+700. Peak power draw so far in normal gaming: 440W. Peak temp 77C.

edit: sweet mercy.


----------



## VPII

Adrian76 said:


> Yeah I tried both 100 then 109 power draw was exactly the same, I wonder why that is, They must have hard limited the non OC variants from what i've tested anyway.


Not just the non-OC model; it's sort of the same for the OC model, from what I gathered.


----------



## nievz

Dreams-Visions said:


> Whew this Strix OC is disgusting. 2115 MHz locked at pretty much all times. +150/+700. Peak power draw so far in normal gaming 440W. Peak temp 77C.
> 
> edit: sweet mercy.


what is your port royal score?


----------



## Dreams-Visions

nievz said:


> what is your port royal score?





















Also, pictures do not do this card justice. What a beauty; I keep looking at it. So well styled.

But yeah, locked at 2115MHz. That Afterburner monitoring shot was taken at the close of a 2-hour gaming session where I had it working as close to 100% as I could manage.


----------



## Reinhardovich773

nievz said:


> what is your port royal score?











Sweet mother of...


----------



## Adrian76

VPII said:


> Not just the OC model, sort of the same for the OC model from what I gathered.


Yeah, I'm not sure; to me it seems these are hard limited. I haven't tried the TUF BIOS, but I'm guessing if this doesn't work, that won't either.


----------



## owikh84

*Strix RTX 3080 OC (Stock vBIOS) @ +150/+1000
10900K @ 5.1GHz
Maximus XII Formula (BIOS 0098)
4x8GB DDR4-4133 CL17
AX1500i
Bitspower Classic Strix GPU waterblock + backplate

Time Spy:* https://www.3dmark.com/3dm/51659947


*Port Royal:* https://www.3dmark.com/3dm/51660140






Ambient temp: 30C here in Malaysia.


----------



## VPII

Adrian76 said:


> Yeah i'm not sure to me it seems these are hard limited, haven't tried the TUF bios but i'm guessing if this doesn't work that won't either.


Don't bother it will not work or change much.

Sent from my SM-G960F using Tapatalk


----------



## Zeakie

VPII said:


> Don't bother it will not work or change much.


You're on a Palit card; he's running a Zotac. Maybe the TUF would work for him. Some Zotacs seem to be locked while some hit above 320W, so it's tinkering that will get results. I'd "bother", as flashing really let my Zotac card stretch its legs a bit more.


----------



## VPII

Zeakie said:


> You're on a palit card hes running zotac.. maybe the tuf might work for him.. some zotacs seem to be locked some hitting above 320w so its tinkering that will get results Id "bother" as it really let me zotac cards legs stretch a bit more by flashing it


Wasn't sure, so sorry.


----------



## Zeakie

VPII said:


> Wasn't sure, so sorry.


No harm done; help is more welcome than negativity. We all just want the best out of our models.


----------



## gerardfraser

lowrider_05 said:


> Nice Job on the CPU Side of things but my TUF OC is a little stronger it seems with 19464 GPU Score
> https://www.3dmark.com/spy/14450715
> 
> what are the OC settings of your 3600XT?


It's OK that you got a higher Time Spy score than me, I am not devastated at all. The 3600XT is just an all-core overclock, nothing special at all.


Spoiler


----------



## TK421

Can a 3080 FE be crossflashed with another BIOS?


----------



## bobby_b

Has anyone tried the Gigabyte 3080 Gaming OC BIOS on the Palit 3080 GamingPro yet?


----------



## bobby_b

VPII said:


> WHat card is it flashed with the Palit Bios.... as the Palit card itself appear to be hardware limited to 320watt, when you increase 9% for 350 watt you'll see max 336 in Time Spy and maybe 340watt in Port Royal, however if you were to run the Octane bench your card would pull 348watt no problem and clocks would remain max almost the entire benchmark.


No, my card uses exactly 350 watts and then runs into the power limit (Time Spy). There is no hardware limitation.


----------



## sakete

For XC3 owners looking to slap on a waterblock: I think Optimus might be coming out with one soon (and I think one for the FTW3 as well). I can't share anything more than that, but they currently have my 3080 XC3 for prototyping their block and I've been in contact with them.


----------



## TobyB

Any ASUS 3080 TUF/TUF OC owners struggling to unlock the power limit past 350W? I have mine at 110% in Afterburner and it rarely increases above 350W in Time Spy; I've seen users on other forums having the same problem.


----------



## Mucho

bobby_b said:


> Did someone try gigabyte 3080 gaming oc bios on the palit 3080 gamingpro yet?


Yes, not working properly. The only working BIOS on the Palit is the Aorus Master BIOS, but there you lose 2 DisplayPorts.


----------



## Chrisch

TobyB said:


> Any ASUS 3080 TUF/TUF OC owners struggling to unlock the power limit past 350w? I have mine on 110% in afterburner and it rarely increases above 350w in timespy, I've seen on other forums users having the same problem.


Have you tried other tools like HWiNFO? I get ~372W reported in HWiNFO where GPU-Z shows a max of 355W.


----------



## BluePaint

19,969 Time Spy GPU, 3080 MSI with the Strix BIOS + 7700K @ 5.1GHz instead of the 3900X from before.
Still on air but with an open window. Hope to get 20,000 when it's a little colder outside.


----------



## DrMorphine

If anyone is interested, here's a breakdown of the Aorus Master 3080. Not in English, but still...


----------



## Alanzaki_073

sakete said:


> For XC3 owners looking to slap on a waterblock, I think Optimus might be coming out with one soon (and I think one for FTW3 as well). Can't share anything more than that, but they currently have my 3080 XC3 for prototyping their block and I've been in contact with them.


Frame Chasers has already put up a video on YT with the Alphacool waterblock for the XC3.


----------



## TobyB

Chrisch said:


> have u tried with other tools like HWInfo? i got on HWInfo ~372W reported where GPUz is max 355W


I've had a max of 373W in GPU-Z during a FurMark bench, but a max of 360W in HWiNFO; does anyone know which is more accurate?


----------



## gerardfraser

TobyB said:


> Any ASUS 3080 TUF/TUF OC owners struggling to unlock the power limit past 350w? I have mine on 110% in afterburner and it rarely increases above 350w in timespy, I've seen on other forums users having the same problem.


My ASUS TUF RTX 3080 hits over 380W in Afterburner.


----------



## TobyB

gerardfraser said:


> My Asus Tuf RTX 3080 hits over 380W in afterburner.


What did you use to draw that power? I had a 373W max value in Afterburner running FurMark; do you think this is normal? Playing games under load it normally stays around 350W max.


----------



## Mucho

Alanzaki_073 said:


> Frame Chasers


That's the waterblock for the reference PCB. It fits the XC3, but the PCB is a little bit longer; not so nice, but the block gets the job done.


----------



## gerardfraser

TobyB said:


> What did you use to draw that power? I had 373w max value in afterburner running Furmark, do you think this is normal? Playing games under load it normally stays around 350w max


Just playing normal PC games in 4K: Metro Exodus, Kingdom Come, Mafia 1/3 DE, Life is Strange 2, Red Dead Redemption 2, Horizon Zero Dawn, Devil May Cry 5, Shadow of the Tomb Raider, Blair Witch. These are the PC games I played in the last month with the RTX 3080. I do not know which game in particular, but when I check in Afterburner I see 380W+.


----------



## rambosbff

Hey dudes, I've been told my 11,883 score in Port Royal isn't very good. I have it undervolted to 925mV and the average clock was 1980MHz at 60°C. I'm not sure what else I could do to achieve greatness. Help? If thermals aren't a problem, just keep on increasing MHz at that voltage? I was told it's not that great in a Discord thread I like. Thanks friends.


----------



## Vapochilled

gerardfraser said:


> Just playing normal PC games in 4K: Metro Exodus, Kingdom Come, Mafia 1/3 DE, Life is Strange 2, Red Dead Redemption 2, Horizon Zero Dawn, Devil May Cry 5, Shadow of the Tomb Raider, Blair Witch. These are the PC games I played in the last month with the RTX 3080. I don't know which game in particular, but when I check in Afterburner I see 380W+.


Can you upload your BIOS? It might be a new version....
That would be the highest 2x8-pin BIOS...


----------



## gerardfraser

rambosbff said:


> Hey dudes, I've been told My 11,883 score in port royal isn't very good. I have it undervolted to 925 and the average mhz was 1980 at 60c. I'm not sure what else I could do to achieve greatness. Help? If thermals aren't a problem just keep on increasing mhz at that voltage? I was told it's not that great from a discord thread I like. Thanks friends.


Stop listening to the crazy people. The RTX 3080 cards all give the same gaming experience and the same FPS within a couple percent (1-5 FPS) across all cards. These Time Spy/Port Royal figures are not gaming-stable overclocks. Enjoy your RTX 3080 for what it is, a great 4K gaming card, and if you're worried someone got a couple more FPS in a game than you, then take a hard look at yourself. It is a computer part and it works fine for PC gaming.



Vapochilled said:


> Can you upload your bios? It might be a new version....
> That would be the highest 2x pin bios...


It is the same BIOS, uploaded a bunch of times already. The card just spikes up to 380W+; it is not sustained. I get that benchmarking is fun and I love it too, but the BIOS is the same as all the other BIOSes I tested, from Aorus Master to Zotac Trinity.


----------



## Reinhardovich773

Hey there @VPII! So I found something which might explain the behaviour of your Palit OC 3080. Igor Wallossek from Igor's Lab did a review of the card and found that the default vBIOS actually only has a max TBP rating of 340W and *not* 350W, like the 9% slider would suggest.
I actually have the exact same card coming this weekend, so I thought maybe we could ask Palit about a future vBIOS update that would allow the card to reliably pull more than 340W of power, though I really don't think they're going to listen, sadly.
Anyway, here's proof of what Igor found:









Finally, here's a link to his full review of the card (it's in German but you can easily translate it to English through Google Translate):

Palit GeForce RTX 3080 Gaming Pro Review - Vernünftiger Einstieg ins NVIDIA-Oberhaus | Seite 4 | igor´sLAB


----------



## VPII

bobby_b said:


> No, my card uses exactly 350W and then runs into the power limit (Time Spy). There is no hardware limitation.


I was asking which card model you have?


----------



## dvfedele

New XOC BIOS for 3080 FTW3 ULTRA available now https://forums.evga.com/EVGA-GeForce-RTX-3080-FTW3-3897-XOC-BIOS-BETA-m3118560.aspx

Due to many user requests, we have a new BETA BIOS that increases the maximum Power Target. This BIOS is only intended for the extreme overclocking user and does not have any other changes. Please note the following:


This update will increase power consumption while overclocking, and it is recommended you have adequate cooling and power (850W+ Gold minimum) when using this.
EVGA does not guarantee any performance increase or overclock while using this BIOS update.


----------



## Reinhardovich773

dvfedele said:


> New XOC BIOS for 3080 FTW3 ULTRA available now https://forums.evga.com/EVGA-GeForce-RTX-3080-FTW3-3897-XOC-BIOS-BETA-m3118560.aspx
> 
> Due to many users request, have a new BETA BIOS that increases the maximum Power Target. This BIOS is only intended for the extreme overclocking user and does not have any other changes. Please note the following:
> 
> 
> This update will increase the power consumption while overclocking, and is recommended you have adequate cooling and power (850w+ Gold minimum) when using this.
> EVGA does not guarantee any performance increase or overclock while using this BIOS update.


Hopefully other manufacturers can follow in EVGA's footsteps. (Looking at you, Palit!)


----------



## gerardfraser

Yeah, are any of you people actually testing PC game FPS? It is like you're all ******ed. A new BIOS will not give a miracle increase in performance.


----------



## Mucho

Reinhardovich773 said:


> Hey there @VPII! So I found something which might explain the behaviour of your Palit OC 3080. Igor Wallossek from Igor's Lab did a review of the card and found that the default vBIOS actually only has a max TBP rating of 340W and *not* 350W, like the 9% slider would suggest.
> I actually have the exact same card coming this weekend, so I thought maybe we could ask Palit about a future vBIOS update that would allow the card to reliably pull more than 340W of power, though I really don't think they're going to listen, sadly.


Well, my Palit is using 350W and I don't know why mine is capable of hitting 350W while the other Palits hit only 340W.


----------



## VPII

Reinhardovich773 said:


> Hey there @VPII! So I found something which might explain the behaviour of your Palit OC 3080. Igor Wallossek from Igor's Lab did a review of the card and found that the default vBIOS actually only has a max TBP rating of 340W and *not* 350W, like the 9% slider would suggest.
> I actually have the exact same card coming this weekend, so I thought maybe we could ask Palit about a future vBIOS update that would allow the card to reliably pull more than 340W of power, though I really don't think they're going to listen, sadly.
> Anyway, here's proof of what Igor found:
> 
> View attachment 2462008
> 
> Finally, here's a link to his full review of the card (it's in German but you can easily translate it to English through Google Translate):
> 
> Palit GeForce RTX 3080 Gaming Pro Review - Vernünftiger Einstieg ins NVIDIA-Oberhaus | Seite 4 | igor´sLAB


When I run the bar file it shows


----------



## rambosbff

gerardfraser said:


> Stop listening to the crazy people. The RTX 3080 cards all give the same gaming experience and the same FPS within a couple percent (1-5 FPS) across all cards. These Time Spy/Port Royal figures are not gaming-stable overclocks. Enjoy your RTX 3080 for what it is, a great 4K gaming card, and if you're worried someone got a couple more FPS in a game than you, then take a hard look at yourself. It is a computer part and it works fine for PC gaming.
> 
> 
> It is the same BIOS, uploaded a bunch of times already. The card just spikes up to 380W+; it is not sustained. I get that benchmarking is fun and I love it too, but the BIOS is the same as all the other BIOSes I tested, from Aorus Master to Zotac Trinity.


Thanks man, I'm not really worried about it, but it made me think. I was pointed to this thread with the claim that Port Royal scores here average 12k minimum. But I thought 1980MHz at 60°C with 925mV was nice.

(Although then I was questioned why my score would be this low considering the MHz/temp I stated.)


----------



## gerardfraser

rambosbff said:


> Thanks man, I'm not really worried about it, but it made me think. I was pointed to this thread with the claim that Port Royal scores here average 12k minimum. But I thought 1980MHz at 60°C with 925mV was nice.


Sorry to be an asshole; your 1980MHz at 60°C with 925mV is awesome. I've owned 3 RTX 3080s so far, so I know the real truth. They're all the fking same.


----------



## rambosbff

gerardfraser said:


> Sorry to be an asshole; your 1980MHz at 60°C with 925mV is awesome. I've owned 3 RTX 3080s so far, so I know the real truth. They're all the fking same.


Yeah, but we're tweakers and another tweaker got me tweakin'. Lol. Thanks for the response though, I think I'll just try to enjoy it now, but what's the fun of that without trying to break it first?


----------



## KingEngineRevUp

FTW3 Ultra XOC Bios















filehosting.org | free easy unlimited filehosting


----------



## Reinhardovich773

VPII said:


> When I run the bar file is shows
> View attachment 2462010


Ah, that is very interesting! Can I ask which vBIOS you are using here? I assume it's the updated Palit GamingPro OC one from their website, right?


----------



## nycgtr

KingEngineRevUp said:


> View attachment 2462011
> 
> 
> 
> 
> 
> 
> 
> filehosting.org | free easy unlimited filehosting


From Jacob at EVGA.






EVGA GeForce RTX 3080 FTW3 XOC BIOS - EVGA Forums


Update 4/2 - Updated to the Resizable Bar Versions Due to many users request, have a new BIOS that increases the maximum Power Target. This BIOS is only intended for the extreme overclocking user and does not have any other changes. Please note the following: This update will increase the ...



forums.evga.com


----------



## Reinhardovich773

gerardfraser said:


> Yeah, are any of you people actually testing PC game FPS? It is like you're all ******ed. A new BIOS will not give a miracle increase in performance.


Thanks for your feedback.


----------



## VPII

Reinhardovich773 said:


> Ah that is very interesting! Can i ask which vBIOS are you using here? I assume it's the updated Palit GamingPro OC one from their website right?


Nope, I'm using either the one my card came with or one I got from a friend here in SA with the same card; both seem to run the same. But now I want to show you what I mean about running the Octane bench. In the graph, look at the stated max power draw and look at the clock speed graph. During the run it only dropped to 2040MHz once, and for the majority of the bench it was sitting at 2100 to 2115MHz. Yup, 349W usage. So I'm not really sure why performance is so held back in 3DMark when, pulling more power in OctaneBench and really heating up the core, it keeps clock speeds way above 2000MHz.


----------



## ssgwright

gerardfraser said:


> Yeah, are any of you people actually testing PC game FPS? It is like you're all ******ed. A new BIOS will not give a miracle increase in performance.


Why are you in this thread dumping on everyone... you do know what this forum is for.... I mean, the literal name of the site is Overclock... In my opinion you either 1. can't get your hands on a 3080 or 2. you did and it's a dud.


----------



## Reinhardovich773

VPII said:


> Nope, I'm using either the one my card came with or one I got from a friend here in SA with the same card, both seem to run the same. But now I want to show you what I mean about running the Octane bench. In the graph look at the stated max power draw and look at the clock speed graph. During the run it only once dropped to 2040mhz and for the majority of the bench it was sitting 2100 to 2115mhz. Yup, 349watt usage. So not really sure why the performance is so drawn back in 3dmark when pulling more power in Octane bench and really heating up the core you keep clock speeds way above 2000mhz.
> 
> View attachment 2462014


I think power consumption depends on the workload you're running (games, synthetic benchmarks, compute benchmarks etc...) and also on the vCore your GPU is currently running at. For example, this Octane benchmark could be pushing your vCore all the way up to 1.081V (which is extremely close to the max vCore allowed on Turing/Ampere GPUs, which is 1.093V). On the other hand, when you're running a different workload (like Time Spy at 1440p for example), your GPU does not need to raise the vCore as high as the Octane benchmark (let's say it only reaches around 1.050V vCore max), and as such you can't draw the maximum 350W TBP of your graphics card because you're limited by the vCore (sometimes shown in GPU-Z as vOP, which means Operating Voltage).
Also, power draw can vary greatly from game to game. As an example, I remember that Control with RTX On at 1440p would make my previous Gigabyte RTX 2070 Gaming OC draw *a lot* more power than, say, Red Dead Redemption II. I'd for example be drawing a maximum of 215W (100% power limit) in RDR 2, whereas I'd easily reach and even exceed (!) 260W (120% power limit) in Control with RTX On. And that's on the exact same voltage/frequency curve, exact same power and temperature limits, exact same memory overclock, and even at the same resolution with both games fully maxed out.
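To put a rough number on the voltage effect: dynamic power scales roughly with frequency times voltage squared (P ∝ f·V²). A quick back-of-the-envelope sketch using the example voltages from this post; leakage, temperature, and workload intensity are all ignored, so this is only an illustration, not a model of any real card:

```python
# Rough dynamic-power scaling, P ∝ f · V² (leakage and workload intensity ignored).
def relative_power(freq_mhz, vcore):
    return freq_mhz * vcore ** 2

octane = relative_power(2100, 1.081)   # compute load near max vCore
timespy = relative_power(2100, 1.050)  # same clock, slightly lower vCore

print(f"~{(octane / timespy - 1) * 100:.1f}% more power from voltage alone")
```

Voltage alone only accounts for a ~6% difference at the same clock; the rest of the gap between workloads comes from how hard each one actually loads the shader array.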


----------



## VPII

Reinhardovich773 said:


> I think power consumption depends on the workload you're running (games, synthetic benchmarks, compute benchmarks etc...) and also on the vCore your GPU's currently running. For example this Octane benchmark could be pushing your vCore all the way up to 1.081 V (which is extremely close to the max vCore allowed on Turing/Ampere GPUs which is 1.093 V). On the other hand, when you're running a different workload (like Time Spy at 1440p for example), your GPU does not need to raise the vCore as high as the Octane benchmark (let's say it only reaches around 1.050 V vCore max) and as such you can't draw the maximum 350 W TBP of your graphics card because you're limited by the vCore (sometimes shown in GPU-Z as vOP, which means Operating Voltage).
> Also, power draw can vary greatly from game to game. As an example, i remember that Control with RTX On at 1440p would make my previous Gigabyte RTX 2070 Gaming OC draw *a lot* more power than say, Red Dead Redemption II. I'd for example be drawing a maximum of 215 W (100% power limit) in RDR 2 whereas i'd easily reach and even exceed (!) 260 W (120% power limit) in Control with RTX ON. And that's on the exact same voltage/frequency curve, exact same power and temperature limits, exact same memory overclock and even at the same resolution with both games fully maxed out.


I understand completely what you are saying, but my question is why the clock speeds vary so much in 3DMark while in Octane they are almost constant and always above 2000MHz. Take these two runs, exactly the same settings. During the OctaneBench, yes, max vCore 1.081V, but in Time Spy it is just a tad lower, 1.075V, yet the clock speeds drop like crazy. I'll say it again: the card is hardware limited to 320W in 3D, and that is how I see it.


----------



## Reinhardovich773

VPII said:


> I understand completely what you are saying, but my question is why the clock speeds vary so much in 3DMark while in Octane they are almost constant and always above 2000MHz. Take these two runs, exactly the same settings. During the OctaneBench, yes, max vCore 1.081V, but in Time Spy it is just a tad lower, 1.075V, yet the clock speeds drop like crazy. I'll say it again: the card is hardware limited to 320W in 3D, and that is how I see it.
> 
> View attachment 2462023
> View attachment 2462024


If your card was truly hardware limited as you said, then how come it is pulling nearly 360W in OctaneBench? As I explained before, power consumption depends on many factors, and just because your card doesn't draw its entire allowed TBP in Time Spy (which runs at 1440p BTW, so not even 4K) does not mean that there is a hardware limitation in place. If you want to be 100% sure, you can ask someone else who has a different card but with the exact same configured TBP, the exact same voltage/frequency curve, the exact same memory overclock, and with both GPUs running at around the same temperatures, with absolutely no CPU or RAM speed/timings bottlenecks on either system.


----------



## martin28bln

That's because the workload is varying. Both benches have different workloads, and the workload also varies during the run (for Time Spy that's certain). So the card sometimes hits the power limit and sometimes doesn't, and under a heavy workload it looks in the clock table and then applies more or less voltage. Sorry, but I hope that explains it a bit....

That's why I have set my points with an AB curve, which normally does not result in fluctuating frequency.
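That flat-curve approach amounts to: pick one voltage point and clamp every higher-voltage point on the curve to that point's frequency, so the card holds one clock instead of chasing the stock boost table. A hypothetical sketch; the curve values below are invented for illustration, not read from any real card, and in practice you would do this in the Afterburner curve editor:

```python
# Hypothetical stock voltage/frequency points (mV -> MHz), made up for illustration.
stock_curve = {850: 1860, 900: 1920, 925: 1965, 950: 2010, 1000: 2070, 1050: 2100, 1081: 2130}

def flatten_curve(curve, target_mv):
    """Clamp all points at or above target_mv to the target point's frequency."""
    target_mhz = curve[target_mv]
    return {mv: (target_mhz if mv >= target_mv else mhz) for mv, mhz in curve.items()}

flat = flatten_curve(stock_curve, 925)
print(flat)  # every point from 925 mV up now runs the 925 mV frequency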


----------



## VPII

Reinhardovich773 said:


> If your card was truly hardware limited as you said, then how come it is pulling nearly 360W in OctaneBench? As I explained before, power consumption depends on many factors, and just because your card doesn't draw its entire allowed TBP in Time Spy (which runs at 1440p BTW, so not even 4K) does not mean that there is a hardware limitation in place. If you want to be 100% sure, you can ask someone else who has a different card but with the exact same configured TBP, the exact same voltage/frequency curve, the exact same memory overclock, and with both GPUs running at around the same temperatures, with absolutely no CPU or RAM speed/timings bottlenecks on either system.


My friend, I firmly understand what you are saying. But if you look at the clock speed differences, during the 3DMark run the clock speeds drop the moment it reaches or passes 320W, which is not the case while running OctaneBench. I really do understand the power limits, but why advertise 320W +9% power limit when it is actually just a 320W power limit? It does not make sense to me. Yes, if I were to run Time Spy without the 9% power limit it would draw at most around 325W, and with the added 9% power limit 339W, maybe 340W. I do understand why it would be lower than OctaneBench, but I do not understand why the clock speeds drop to 1905MHz during 3DMark while the lowest during OctaneBench is like 2040MHz, maybe even more.


----------



## DStealth

Has anyone tried the XOC BIOS on a 2x8-pin card already, any benefits? As I'm not home, I'm curious whether it will give some headroom, as the standard FTW3 and Strix don't...


----------



## lowrider_05

DStealth said:


> Has anyone tried the XOC BIOS on a 2x8-pin card already, any benefits? As I'm not home, I'm curious whether it will give some headroom, as the standard FTW3 and Strix don't...


It's the same as the Strix OC BIOS, because it's NOT an XOC BIOS; it just has the same 450W limit as the Strix OC now.


----------



## ssgwright

Ya, I tried a 3x8-pin BIOS on my 2x8-pin card and GPU-Z reads a higher watt pull (which is false, due to GPU-Z assuming 3x8-pin), but you'll get less performance because you're actually getting a third less power.
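A rough way to see why that happens: if the BIOS budgets its power target across three 8-pin rails plus the slot, a card with only two rails can never populate one third of that rail budget. The numbers and the even per-rail split below are assumptions for illustration, not EVGA's actual firmware behaviour:

```python
# Simplified model: a 3x8-pin BIOS splits its target across three rails plus the slot.
# On a 2x8-pin card one rail is simply absent, so actual draw is lower than reported.
bios_target_w = 450
slot_w = 66                                 # rough PCIe slot budget (assumed)
per_rail_w = (bios_target_w - slot_w) / 3   # even split across three 8-pin rails

actual_w = slot_w + 2 * per_rail_w          # only two 8-pin rails populated
print(f"reported target {bios_target_w} W, actual ceiling about {actual_w:.0f} W")
```

Which is in the ballpark of the "third less power" observed, even though monitoring software keeps reporting against the full three-rail target.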


----------



## zhrooms

XOC means *E*xtreme *O*ver*c*locking, it's a term used for sub-zero cooling, such as DICE/LN2/Helium and so on.

_*Extreme Overclocking* BIOS for *Extreme Cooling* Methods_

These BIOSes remove power and temperature limits. EVGA Jacob saying this is an *XOC* BIOS is (e)Xtremely misleading, straight up *lying*.

EVGA has been a ****show since day one of Ampere. First they went out with the capacitor statement calling them "SP" caps, which was incorrect, as SP caps are specifically Panasonic, and they clearly weren't. That should be impossible to get wrong, yet it happened, and they didn't even bother correcting it after dozens of people called them out.

Then _EVGA Jacob_ went out of his way on Twitter to respond to multiple users asking about the power limit, saying the RTX 3080 FTW3 would feature a power limit of 420W and the RTX 3090 FTW3 a power limit of 440W, and made it sound like a lot too, which it isn't. Then, when the cards first started shipping, review samples had a limit of 400W, and of course that's what the retail samples actually shipped with too, going against what was promised; 400W on the 3080 and 450W on the 3090 were the final numbers, 20W less and 10W more respectively.

Now, after everyone has been flashing the competitor's (Strix) BIOS on their card because it offered another 50W for overclocking, they did something about it. It's been 29 days since the RTX 3080 went on sale; it took them that long to issue a simple BIOS update that should've been shipping on review samples & retail in the first place. And 450W is not special, they could safely offer users up to 520W if they wanted to, as that's the 3x8-pin spec (safe).

Then there's the "red clown lips" debacle, basically no one liked them so how the hell did that get approved as a final design decision? They're offering free replacement ones now because people are that upset, costing them customers and money shipping free replacement parts, that (red) should never have shipped with it in the first place.

Also need to mention the XC3 pricing, 3090 is $30 above FE with no factory OC or backplate, for an extra $20 you get the backplate, and for yet another price bump of $70 this time, you get.. 30MHz factory overclock. XC3 Ultra is the worst card (price/performance) on the market currently (shared last place with MSI Ventus OC).

Then XC3 had/has a bugged BIOS, a lot of people have reported the 366W power limit does not work, they are capped at around 330W, no update has been issued as far as I know, and it's been a month, maybe after they issued an update for FTW3 a XC3 one isn't far off.

To sum it up, with all of these things combined there's no way you can look at EVGA in a positive way; they've made so many mistakes while some other partners have made none. I've owned multiple EVGA cards back in the day (500, 700 & 1000 series), but in the last few years it's gotten harder and harder to recommend their products. One of the main things about EVGA has been warranty; they've been very open and vocal about allowing their users to replace the cooler and thermal paste, which feels "safe" to the users, but this policy is actually offered by other brands as well, just not in the same vocal way. And we have a lot of consumer protection in the EU as an example (NA too), and most RMA locations are located in Germany regardless of brand, so it doesn't really matter; shipping cost should be the same, as well as delivery/return time. (Gigabyte actually offers an extra year of RMA warranty; that's a big deal when selling the card used when/if you get a new one.)

Right now after this BIOS update, the FTW3 non Ultra has the highest value out of any card, by far (it's $60 less than Strix), so I'm still going to recommend it, *but with caution*. If you can get a Strix (OC or Non-OC it doesn't matter) for the same price as a FTW3/FTW3 Ultra (also doesn't matter), I'd strongly recommend to get the Strix over FTW3, for the above reasons. ASUS did everything right with Ampere, showed they cared about the consumers, EVGA made multiple mistakes showing they don't care (as much) about us, this is the first time with the BIOS update that they showed anything (positive) at all, and it was a month late. So I'm not going to commend EVGA for this, should've never happened in the first place, this was completely avoidable.


----------



## cstkl1

As I was doing waterblock assembly for the Strix 3080... this card sux btw. On air it only boosts to 2070 and holds a constant 2040-2055.
@owikh84 does a constant 2145...

Strix 3090 confirmed and on the way in 1 hour...


----------



## asdkj1740

GamersNexus got an 800W BIOS for XOC, while the rest got 450W for XOC.
lmao.



zhrooms said:


> XOC means *E*xtreme *O*ver*c*locking, it's a term used for sub-zero cooling, such as DICE/LN2/Helium and so on.
> 
> _*Extreme Overclocking* BIOS for *Extreme Cooling* Methods_
> 
> These BIOSes remove power and temperature limits. EVGA Jacob saying this is an *XOC* BIOS is (e)Xtremely misleading, straight up *lying*.
> 
> EVGA has been a ****show since day one of Ampere. First they went out with the capacitor statement calling them "SP" caps, which was incorrect, as SP caps are specifically Panasonic, and they clearly weren't. That should be impossible to get wrong, yet it happened, and they didn't even bother correcting it after dozens of people called them out.
> 
> Then _EVGA Jacob_ went out of his way on twitter to respond to multiple users asking about the power limit, saying RTX 3080 FTW3 would feature a power limit of 420W and RTX 3090 FTW3 a power limit of 440W, made it sound like it was a lot too, which it isn't, then when the cards first started shipping, review samples had a limit of 400W, and of course that's what the retail samples actually shipped with too, going against what was promised, 400W on the 3080 and 450W on the 3090 were the final numbers, 20 less and 10 more respectively.
> 
> Now after everyone has been flashing the competitors (Strix) BIOS on their card because it offered another 50W for overclocking, they did something about it, it's been 29 days since RTX 3080 went on sale, took them that long to issue a simple BIOS update, that should've been shipping on review samples & retail in the first place. And 450W is not special, they could safely offer users up to 520W if they wanted to as that's the 3x8-Pin spec (safe).
> 
> Then there's the "red clown lips" debacle, basically no one liked them so how the hell did that get approved as a final design decision? They're offering free replacement ones now because people are that upset, costing them customers and money shipping free replacement parts, that (red) should never have shipped with it in the first place.
> 
> Also need to mention the XC3 pricing, 3090 is $30 above FE with no factory OC or backplate, for an extra $20 you get the backplate, and for yet another price bump of $70 this time, you get.. 30MHz factory overclock. XC3 Ultra is the worst card (price/performance) on the market currently (shared last place with MSI Ventus OC).
> 
> Then XC3 had/has a bugged BIOS, a lot of people have reported the 366W power limit does not work, they are capped at around 330W, no update has been issued as far as I know, and it's been a month, maybe after they issued an update for FTW3 a XC3 one isn't far off.
> 
> To sum it up, with all of these things combined there's no way you can look at EVGA in a positive way; they've made so many mistakes while other partners have made none. I've owned multiple EVGA cards back in the day (500, 700 & 1000 series), but in the last few years it's gotten harder and harder to recommend their products. One of the main things about EVGA has been warranty; they've been very open and vocal about allowing their users to replace the cooler and thermal paste, which feels "safe" to the users, but this policy is actually offered by other brands as well, just not in the same vocal way. And we have a lot of consumer protection in the EU as an example (NA too), and most RMA locations are located in Germany regardless of brand, so it doesn't really matter; shipping cost should be the same, as well as delivery/return time. (Gigabyte actually offers an extra year of RMA warranty; that's a big deal when selling the card used when/if you get a new one.)
> 
> Right now after this BIOS update, the FTW3 non Ultra has the highest value out of any card, by far (it's $60 less than Strix), so I'm still going to recommend it, *but with caution*. If you can get a Strix (OC or Non-OC it doesn't matter) for the same price as a FTW3/FTW3 Ultra (also doesn't matter), I'd strongly recommend to get the Strix over FTW3, for the above reasons. ASUS did everything right with Ampere, showed they cared about the consumers, EVGA made multiple mistakes showing they don't care (as much) about us, this is the first time with the BIOS update that they showed anything (positive) at all, and it was a month late. So I'm not going to commend EVGA for this, should've never happened in the first place, this was completely avoidable.


You forgot the EVGA version of the FE adapter! Too rude!



DrMorphine said:


> If someone is interested, break down of Aorus Master 3080, not in english but still...


Same PCB as the Eagle and Gaming OC, and it eventually gives us back the single phase removed on the Eagle/Gaming PCB.
No wonder the power limit is almost the same as the Gaming OC.

And they're charging $849 USD...


----------



## KingEngineRevUp

zhrooms said:


> To sum it up, all of these things combined, there's no way you can look at EVGA in a positive way,


That's funny, almost everything you said in your post are reasons I see them in a positive light.


----------



## GTANY

zhrooms said:


> XOC means *E*xtreme *O*ver*c*locking, it's a term used for sub-zero cooling, such as DICE/LN2/Helium and so on.
> 
> _*Extreme Overclocking* BIOS for *Extreme Cooling* Methods_
> 
> These BIOSes remove power and temperature limits. EVGA Jacob saying this is an *XOC* BIOS is (e)Xtremely misleading, straight up *lying*.
> 
> EVGA has been a ****show since day one of Ampere. First they went out with the capacitor statement calling them "SP" caps, which was incorrect, as SP caps are specifically Panasonic, and they clearly weren't. That should be impossible to get wrong, yet it happened, and they didn't even bother correcting it after dozens of people called them out.
> 
> Then _EVGA Jacob_ went out of his way on twitter to respond to multiple users asking about the power limit, saying RTX 3080 FTW3 would feature a power limit of 420W and RTX 3090 FTW3 a power limit of 440W, made it sound like it was a lot too, which it isn't, then when the cards first started shipping, review samples had a limit of 400W, and of course that's what the retail samples actually shipped with too, going against what was promised, 400W on the 3080 and 450W on the 3090 were the final numbers, 20 less and 10 more respectively.
> 
> Now after everyone has been flashing the competitors (Strix) BIOS on their card because it offered another 50W for overclocking, they did something about it, it's been 29 days since RTX 3080 went on sale, took them that long to issue a simple BIOS update, that should've been shipping on review samples & retail in the first place. And 450W is not special, they could safely offer users up to 520W if they wanted to as that's the 3x8-Pin spec (safe).
> 
> Then there's the "red clown lips" debacle, basically no one liked them so how the hell did that get approved as a final design decision? They're offering free replacement ones now because people are that upset, costing them customers and money shipping free replacement parts, that (red) should never have shipped with it in the first place.
> 
> Also need to mention the XC3 pricing, 3090 is $30 above FE with no factory OC or backplate, for an extra $20 you get the backplate, and for yet another price bump of $70 this time, you get.. 30MHz factory overclock. XC3 Ultra is the worst card (price/performance) on the market currently (shared last place with MSI Ventus OC).
> 
> Then XC3 had/has a bugged BIOS, a lot of people have reported the 366W power limit does not work, they are capped at around 330W, no update has been issued as far as I know, and it's been a month, maybe after they issued an update for FTW3 a XC3 one isn't far off.
> 
> To sum it up, with all of these things combined there's no way you can look at EVGA in a positive way; they've made so many mistakes while other partners have made none. I've owned multiple EVGA cards back in the day (500, 700 & 1000 series), but in the last few years it's gotten harder and harder to recommend their products. One of the main things about EVGA has been warranty; they've been very open and vocal about allowing their users to replace the cooler and thermal paste, which feels "safe" to the users, but this policy is actually offered by other brands as well, just not in the same vocal way. And we have a lot of consumer protection in the EU as an example (NA too), and most RMA locations are located in Germany regardless of brand, so it doesn't really matter; shipping cost should be the same, as well as delivery/return time. (Gigabyte actually offers an extra year of RMA warranty; that's a big deal when selling the card used when/if you get a new one.)
> 
> Right now after this BIOS update, the FTW3 non Ultra has the highest value out of any card, by far (it's $60 less than Strix), so I'm still going to recommend it, *but with caution*. If you can get a Strix (OC or Non-OC it doesn't matter) for the same price as a FTW3/FTW3 Ultra (also doesn't matter), I'd strongly recommend to get the Strix over FTW3, for the above reasons. ASUS did everything right with Ampere, showed they cared about the consumers, EVGA made multiple mistakes showing they don't care (as much) about us, this is the first time with the BIOS update that they showed anything (positive) at all, and it was a month late. So I'm not going to commend EVGA for this, should've never happened in the first place, this was completely avoidable.


I agree: EVGA is not the best choice on Ampere. I am waiting for a 3090 FTW3 because of its low price (probably a price error), but my first choice was an ASUS 3090 Strix. Nevertheless, I will not regret my purchase if EVGA launches a 520W BIOS for this card.


----------



## KingEngineRevUp

zhrooms said:


> they've made so many mistakes while other partners has made none,


Really?

Gigabyte has messed up on almost every card they have. Their 8-pins are popping out. The Xtreme doesn't have a BIOS to go above 370W; it's essentially just a Master right now.

The MSI Trio is built from mediocre PCB parts and has 3x 8-pins but a 350W TDP. It costs more than a TUF non-OC, but according to its BOM it should cost as much as an FE. It doesn't even have a BIOS switch. Oh yeah, there's also the "graphene" plastic backplate.

ASUS tried riding the MLCC bullshit when their TUF cards were crashing on multiple systems too. Tech YES City, Gear Seekers, and Hardware Unboxed all had TUF cards that crashed even with the whole glorious MLCC arrangement they had.

I mean, come on. It sounds like you're just giving EVGA **** just to give them **** honestly.

I have no problem with critiquing them fairly. But to say "no other partner messed up" is completely biased.


----------



## zhrooms

GTANY said:


> if EVGA launches a 520 W bios for this card.


They won't, that power limit is reserved for Kingpin, which will run either 520 or 525W.



KingEngineRevUp said:


> Really? I mean, come on. It sounds like you're just giving EVGA *** just to give them *** honestly.


I didn't say no other partner has made mistakes; "other" doesn't mean "all". And I'm going to sum up all the positives and negatives in the original post next to the partner cards; I've already started on some, but it will take time to flesh out properly.



asdkj1740 said:


> gamersnexus got 800w bios for xoc, *while the rest got 450w for xoc.*


Please don't call it XOC, because it's not, and neither is GamersNexus' 800W one; it's just a regular BIOS with the power limit increased for some influencers/YouTubers (promotional purposes). 800W is nowhere near enough for LN2 (with voltage control); I'm assuming they put it at 800W so it wouldn't throttle at the stock 1.1V.


----------



## KingEngineRevUp

zhrooms said:


> I didn't say no other partner has made mistakes, other doesn't mean all.


That's bullshit, I have your quote right here.

"they've made so many mistakes while other partners has made none,"


----------



## zhrooms

KingEngineRevUp said:


> That's bullshit, I have your quote right here.
> 
> "they've made so many mistakes while other partners has made none,"


I added "some" in front of _other_ about 10 minutes before you made this reply; actually, I edited it right away when I saw your first reply. Maybe refresh the page before you start quoting? Also, maybe use your brain for once? There are 14 other brands that I know of offering Ampere cards; *it's extremely obvious* I don't mean *no other brand* has made a single mistake/bad decision.


----------



## asdkj1740

zhrooms said:


> Please don't call it XOC, because it's not, and neither is GamersNexus' 800W one; it's just a regular BIOS with the power limit increased for some influencers/YouTubers (promotional purposes). 800W is nowhere near enough for LN2 (with voltage control); I'm assuming they put it at 800W so it wouldn't throttle at the stock 1.1V.


Indeed, but we are all here for a BIOS with a high enough power limit.
I still haven't seen a shunt mod that can keep the voltage locked at the max level, like 1.1V; that's what a high power limit BIOS like 800W could solve, I guess.


----------



## dr.Rafi

VPII said:


> Don't bother it will not work or change much.
> 
> Sent from my SM-G960F using Tapatalk


I tried the ASUS Strix OC RTX 3080 ROM on my MSI Ventus RTX 3080 (the ASUS has 3 power plugs, the MSI only 2) and it worked: I'm getting a 2150 MHz GPU clock stable in many games, and GPU board power reaches 460W max. The Strix BIOS tricks the shunt measurement like a software shunt mod, though, and GPU-Z shows only 90W on one of the 8-pin power rails. The memory is hard to overclock on the Ventus; I think they use a cheap memory batch, or it's because I'm only watercooling the GPU while the memory and power circuits have only small heatsinks with a powerful fan. I'm waiting for a special water block, which is not in stock for most of the 3000 family; only Bitspower has stock, but not for the MSI Ventus.


----------



## asdkj1740

For the TUF's MLCC cap combination, as Igor described, the MLCC is mainly for buffering. To me it seems ASUS tuned the TUF too aggressively, using that additional buffering to reach higher frequencies instead. Also, without proper cooling (like what Gigabyte did in the past on some Aorus cards, with a copper plate to cool the caps under the GPU socket), the MLCCs degrade in performance.

It is interesting to see Gigabyte still chooses to use full SP-Caps for Aorus cards, while some vendors like Gainward/Colorful/Galax have changed back to 1 MLCC + 5 SP-Caps.


----------



## HyperMatrix

Someone posted this score on the EVGA forums with the new 450W BIOS. This was on air, with temps hitting 68c, so it doesn't appear to have been AC-boxed. That's higher than some of the lower-end RTX 3090 cards, like Zotac, which makes the worst cards ever. Impressive numbers from the FTW3. Just as a note: don't expect these scores. This guy obviously got a card with a better-than-average chip. But it's still impressive seeing these numbers on an air-cooled 3080.


----------



## asdkj1740

HyperMatrix said:


> Someone posted this score on the EVGA forums with the new 450W BIOS. This was on air, with temps hitting 68c, so it doesn't appear to have been AC-boxed. That's higher than some of the lower-end RTX 3090 cards, like Zotac, which makes the worst cards ever. Impressive numbers from the FTW3. Just as a note: don't expect these scores. This guy obviously got a card with a better-than-average chip. But it's still impressive seeing these numbers on an air-cooled 3080.
> 
> View attachment 2462032


Honestly, to me, the EVGA RTX 3000 series models are marked up $50 USD.
If the FTW3 (non-Ultra) were priced at $749/$759, it would be a good deal.
Not to mention those XC3 cards, totally not worth it.


----------



## DStealth

HyperMatrix said:


> w 450W bios. This was on air, and temps hitting 68c. S


Wow 4fps higher than the worst and power limited 3080 [email protected]


----------



## Celeras

Think I pretty much maxed out my XC3 Ultra with a 19154 GPU score: https://www.3dmark.com/spy/14564251

Voltage/clocks drop too swiftly to squeeze out much more; the variance between my max and average clocks is huge. But I still think that's pretty good considering I am stuck with two 8-pins and 340W (it should be 366W, but I never see close to that). Maybe my model will get a fake XOC BIOS soon too.


----------



## dr.Rafi

gerardfraser said:


> Stop listening to the crazy people.The RTX 3080 cards give the same gaming experience and same FPS by a couple % (1FPS-5FPS)over all cards.These timespy/portroyal figures are not PC gaming stable overclocks.Enjoy your RTX 3080 for what it is ,a great 4K gaming card and if your worried someone got a couple FPS in a PC game more that you,then take a hard look at yourself.It is a computer part and it works fine for PC gaming.
> 
> 
> It is the same BIOS uploaded a bunch of times already.The card just spikes up to 380w+ it is not sustained,I get that benchmarking is fun and I love it also but the BIOS is the same as all the other BIOS I tested from Arous Master to Zotac Trinity.


Try the Strix OC; it's giving me 455 to 460 watts. I tried it on my MSI Ventus 3080 (320W default), and don't worry about having 2 vs 3 power connectors. I tried every BIOS I found on this forum (Aorus Master, EVGA Ultra, MSI Trio, and finally ASUS Strix) and they all worked. The only bad scenario was the Palit BIOS, which gave me loops of stuttering to black screen and back whenever the driver was enabled; I disabled the graphics driver, reflashed to another stable BIOS, and everything went back to normal. The smallest gain was the Aorus Master. I did the same on the 2080 Ti family: I had an ASUS Dual and a Ventus and flashed them with every BIOS I found online, even the Kingpin and ASUS Matrix and all the unverified ones. The worst scenario was exactly the same; I don't remember which one, but some BIOSes gave me a stuttering screen, which was fixed by flashing back. Other bad scenarios, which I don't care about, are losing some DisplayPorts or HDMI, because some cards use different display output chips and circuits. But when you find your card pulling a lot of power, try to observe the power circuit temperature on your card with an infrared thermometer or probes if your card doesn't have built-in sensors for those circuits, and provide adequate cooling when you push your card to the limit. Good luck, and sorry for my moderate English.


----------



## HyperMatrix

DStealth said:


> Wow 4fps higher than the worst and power limited 3080 [email protected]


3 fps is only 1 fps higher than 2 fps. 3 fps is also 50% more fps than 2 fps. So this card is approximately 9% faster than your card.


----------



## dr.Rafi

bobby_b said:


> Did someone try gigabyte 3080 gaming oc bios on the palit 3080 gamingpro yet?


Try the ASUS Strix OC, trust me, it's a miracle with overclocking. Don't use the TUF, it won't do anything.


----------



## Zeakie

dr.Rafi said:


> Try the ASUS Strix OC, trust me, it's a miracle with overclocking. Don't use the TUF, it won't do anything.


Mind linking it? Will give it a go on my zotac


----------



## acoustic

I'm not sure why I'd flash the new FTW3 BIOS to my FTW3 when the Strix BIOS is basically glitching and giving me an unlimited power limit...


----------



## Zelo

3080 Aorus Master owner here. Why am I getting a lower Time Spy score when I OC to +100/+700?

Pre OC Time Spy Graphics Score: 18080 -- Overall: 17173
OC(+100/+700) Graphics Score: 16817 -- Overall: 16161

I ran it a few times and got similar results.


----------



## HyperMatrix

asdkj1740 said:


> Honestly, to me, the EVGA RTX 3000 series models are marked up $50 USD.
> If the FTW3 (non-Ultra) were priced at $749/$759, it would be a good deal.
> Not to mention those XC3 cards, totally not worth it.


The EVGA FTW3 is $750 if you buy it directly from them. Just use the standard associates discount code.


----------



## MangixZ

A new vBIOS for the AORUS GeForce RTX™ 3080 MASTER 10G is available now.





AORUS GeForce RTX™ 3080 MASTER 10G (rev. 1.0) | GIGABYTE (www.gigabyte.cn)





Can anyone try it and give feedback?


----------



## Chrisch

Zelo said:


> 3080 Aorus Master owner here. Why am I getting a lower Time Spy score when I OC to +100/+700.
> 
> Pre OC Time Spy Graphics Score: 18080 -- Overall: 17173
> OC(+100/+700) Graphics Score: 16817 -- Overall: 16161
> 
> I ran it a few times and got similar results.


Try a lower mem OC... if the memory is clocked too high, errors get corrected/retried and it effectively throttles, so you get lower scores.
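A toy way to picture why a higher memory offset can lower benchmark scores (all numbers below are made up, purely illustrative): effective throughput climbs with clock until errors appear, after which correction/retry overhead erases the gains.

```python
# Toy model (hypothetical numbers): effective memory throughput vs. clock.
def effective_bandwidth(clock_mhz: float) -> float:
    stable_limit = 1300.0  # hypothetical error-free clock ceiling
    if clock_mhz <= stable_limit:
        overhead = 0.0
    else:
        # made-up retry cost: each MHz past the limit adds 0.4% overhead
        overhead = (clock_mhz - stable_limit) * 0.004
    return clock_mhz * max(0.0, 1.0 - overhead)

print(effective_bandwidth(1250))  # below the limit: full benefit of the clock
print(effective_bandwidth(1400))  # above it: retries more than erase the gain
```

The exact shape of the curve on real GDDR6X is not public; the point is only that "higher clock" stops meaning "higher throughput" past the stability limit, which matches the lower Time Spy score seen here.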


----------



## dr.Rafi

Zeakie said:


> Mind linking it? Will give it a go on my zotac


----------



## asdkj1740

HyperMatrix said:


> The EVGA FTW3 is $750 if you buy it directly from them. Just use the standard associates discount code.


EVGA US accepts only US credit cards and a US residential address; it has been a total pain for foreigners to buy EVGA stuff on the official EVGA US site.
The EVGA eBay store rarely gets the same deals as the official EVGA US site.

It is disappointing to see EVGA has not replaced the solid caps used on the reference PCB with SP-Caps/POSCAPs on the front side of the FTW3 PCB (just like the FE), compared to the Pascal FTW and Turing FTW. And what can you say about the $699 TUF from ASUS, the well-known greedy, overpriced vendor?
I am not impressed with the EVGA FTW3 PCB on the RTX 3000 series, but the FTW3 cooler seems not bad.
For 3x 8-pins, if the FTW3 were priced around the MSI Gaming Trio, I would go for EVGA.

Actually, Colorful PCBs are awesome, but the current prices are sadly insane.


----------



## dr.Rafi

here we go

strix3080oc.rom (drive.google.com)


----------






## Zeakie

dr.Rafi said:


> here we go
> 
> 
> 
> 
> 
> strix3080oc.rom
> 
> 
> 
> 
> 
> 
> 
> drive.google.com


Thanks, found it myself... it reports over 400W, but clocks never go above 2K, and it gets a way worse score in Port Royal than the stock BIOS. The Strix is a no-go on the Zotac card.


----------



## dr.Rafi

Zeakie said:


> Thanks, found it myself... it reports over 400W, but clocks never go above 2K, and it gets a way worse score in Port Royal than the stock BIOS. The Strix is a no-go on the Zotac card.


which zotac do you have


----------



## Zeakie

dr.Rafi said:


> which zotac do you have


Trinity oc.


----------



## GTANY

asdkj1740 said:


> EVGA US accepts only US credit cards and a US residential address; it has been a total pain for foreigners to buy EVGA stuff on the official EVGA US site.
> The EVGA eBay store rarely gets the same deals as the official EVGA US site.
> 
> It is disappointing to see EVGA has not replaced the solid caps used on the reference PCB with SP-Caps/POSCAPs on the front side of the FTW3 PCB (just like the FE), compared to the Pascal FTW and Turing FTW. And what can you say about the $699 TUF from ASUS, the well-known greedy, overpriced vendor?
> I am not impressed with the EVGA FTW3 PCB on the RTX 3000 series, but the FTW3 cooler seems not bad.
> For 3x 8-pins, if the FTW3 were priced around the MSI Gaming Trio, I would go for EVGA.
> 
> Actually, Colorful PCBs are awesome, but the current prices are sadly insane.


I agree : the FTW3 PCB looks like a low-end one, especially after Strix PCB comparison.


----------



## dr.Rafi

Zeakie said:


> A quick run of FurMark 8x MSAA 4K reported over 335W, running it for a minute with YT in the background. So for anyone power limited on a Zotac stuck at 320W, the Palit BIOS at least helps a bit there.


what was your max fps on furmark


----------



## Zeakie

dr.Rafi said:


> what was your max fps on furmark


117 for the preset run


----------



## dr.Rafi

Zeakie said:


> 117


that is 4k 8x anti aliasing?!


----------



## jofbig

Hi guys,

I'm a new owner of a TUF RTX3080 OC, and wanted to share with you a little concern I've got regarding the following :










The MLCC capacitors are "touching", especially in the middle part, and I was wondering if that could harm the card with long-term usage?
At the moment there's nothing to report, and I managed to boost the core clock @+150 without any issues, but the clocks aren't very high; it peaks @2115 but averages @2010/2025 most of the time.

Another question: do you think flashing the Strix OC vBIOS onto my card would bring any benefit? (I guess I could just try, but was wondering  )

Cheers


----------



## dr.Rafi

jofbig said:


> Hi guys,
> 
> I'm a new owner of a TUF RTX3080 OC, and wanted to share with you a little concern I've got regarding the following :
> 
> View attachment 2462040
> 
> 
> The MLCC capacitors are "touching", especially in the middle part, and I was wondering if that could harm the card with long-term usage?
> At the moment there's nothing to report, and I managed to boost the core clock @+150 without any issues, but the clocks aren't very high; it peaks @2115 but averages @2010/2025 most of the time.
> 
> Another question: do you think flashing the Strix OC vBIOS onto my card would bring any benefit? (I guess I could just try, but was wondering  )
> 
> Cheers


Don't worry, the terminals of those 4 caps are already connected on the PCB.


----------



## Zeakie

Zeakie said:


> 117





dr.Rafi said:


> that is 4k 8x anti aliasing?!


4K preset: 117 FPS. 4K 8x MSAA: 33-36 FPS.


----------



## dr.Rafi

dr.Rafi said:


> that is 4k 8x anti aliasing?!


The 4K preset doesn't use 8x anti-aliasing; choose the resolution manually and run the GPU stress test with 8x anti-aliasing.


----------



## dr.Rafi

Zeakie said:


> 4K preset: 117 FPS. 4K 8x MSAA: 33-36 FPS.


OK, I get 38 max, even though my CPU is a 3900X boosting to a max of 4.25 GHz.


----------



## asdkj1740

GTANY said:


> I agree : the FTW3 PCB looks like a low-end one, especially after Strix PCB comparison.


And when you see the Aorus PCB, you will then feel good about the FTW3, lol.
I don't think PCB components affect performance in any significant way, but I don't want to pay the markup for nothing.


----------



## Zelo

Chrisch said:


> Try a lower mem OC... if the memory is clocked too high it will throttle and you get lower scores.


That worked, thanks! Lowered it to +100/+500 and got an 18645 graphics and 17585 overall Time Spy score. Should I just keep the OC there for best gaming performance?


----------



## jofbig

dr.Rafi said:


> Don't worry, the terminals of those 4 caps are already connected on the PCB.


Okay, thanks


----------



## Reinhardovich773

*Heads-up guys*!

Latest GPU-Z is out and it adds the ability to save/export Ampere GPUs vBIOSes. Grab it here: TechPowerUp GPU-Z v2.35.0 Released


----------



## dr.Rafi

asdkj1740 said:


> And when you see the Aorus PCB, you will then feel good about the FTW3, lol.
> I don't think PCB components affect performance in any significant way, but I don't want to pay the markup for nothing.


It will affect the longevity of the card when overclocking.


----------



## dr.Rafi

Zelo said:


> That worked thanks! Lowered it to +100/+500 and got 18645 gfx and 17585 overall time spy score. Should I just keep the OC there for best gaming performance?


So correct. I lowered mine to -500 and it's not affecting 4K gaming FPS; there's only a couple FPS drop at lower resolutions.


----------



## asdkj1740

dr.Rafi said:


> It will affect the longevity of the card when overclocking.


I don't think so.


----------



## zhrooms

Reinhardovich773 said:


> *Heads-up guys*!
> 
> Latest GPU-Z is out and it adds the ability to save/export Ampere GPUs vBIOSes. Grab it here: TechPowerUp GPU-Z v2.35.0 Released


Finally! 

TechPowerUp VGA BIOS database (www.techpowerup.com): extensive repository of graphics card BIOS image files, with submissions categorized by GPU vendor, type, and board partner variant.

Keep an eye there, all the BIOSes should appear in the next few days!


----------



## Chrisch

Zeakie said:


> Thanks, found it myself... it reports over 400W, but clocks never go above 2K, and it gets a way worse score in Port Royal than the stock BIOS. The Strix is a no-go on the Zotac card.


A BIOS from a card with 3x 8-pins makes no sense on a card with 2x 8-pins, because it misreports the consumption across the rails and in the end you have less power than with the stock BIOS.


----------



## Zeakie

dr.Rafi said:


> OK, I get 38 max, even though my CPU is a 3900X boosting to a max of 4.25 GHz.





Chrisch said:


> A BIOS from a card with 3x 8-pins makes no sense on a card with 2x 8-pins, because it misreports the consumption across the rails and in the end you have less power than with the stock BIOS.


Yeah, learned that the hard way. The Palit/TUF BIOS seems to work best. Haven't tried the new Gigabyte one posted here.


----------



## Vapochilled

dr.Rafi said:


> Try the ASUS Strix OC, trust me, it's a miracle with overclocking. Don't use the TUF, it won't do anything.


It would not work: 3x 8-pin vs 2x.


----------



## DStealth

HyperMatrix said:


> 3 fps is only 1 fps higher than 2 fps. 3 fps is also 50% more fps than 2 fps. So this card is approximately 9% faster than your card.


I was just joking. Actually, it's less than 8% faster with that 4 FPS difference, while consuming ~60% more energy (280W vs 450W). Anyway, I will do the shunt mod too... but for the moment, games and Port Royal undervolted at 0.9V with 55+ FPS are more than enough for air use while waiting for the water block to come.
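A quick sanity check on those percentages, using the rough numbers from this post (the ~52 FPS baseline is an assumption inferred from the "4 FPS on 55+ FPS" figures):

```python
# Rough perf-per-watt comparison, assumed numbers from this post:
# ~52 vs ~56 FPS (a 4 FPS difference) and 280W vs 450W board power.
base_fps, oc_fps = 52.0, 56.0
base_w, oc_w = 280.0, 450.0

perf_gain = (oc_fps - base_fps) / base_fps * 100    # relative FPS gain, %
power_increase = (oc_w - base_w) / base_w * 100     # relative power increase, %

print(f"perf: +{perf_gain:.1f}%, power: +{power_increase:.1f}%")
# perf: +7.7%, power: +60.7%
```

With any baseline in the low 50s, the conclusion is the same: a single-digit performance gain for a ~60% power increase, which is why undervolting is the more attractive daily option.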


----------



## bobby_b

jofbig said:


> Hi guys,
> 
> I'm a new owner of a TUF RTX3080 OC, and wanted to share with you a little concern I've got regarding the following :
> 
> View attachment 2462040
> 
> 
> The MLCC capacitors are "touching", especially in the middle part and I was wondering if that could harm the card on long term usage ?
> At the moment, nothing to report, and I managed to boost the core clock @+150 without any issues, but the clocks aren't quite high; peaks @2115, but averages @2010/2025 most of times.
> 
> Another question; do you think flashing the Strix OC vBIOS onto my card would get it any benefits ? (I gues I could just try, but was wondering  )
> 
> Cheers


I don't know about the BIOS, but you don't have to worry about the capacitors! Everything is fine in that picture. If you look closely, you'll see that they are connected either way by the little square copper points/plates on the PCB. So if the caps on those copper plates have contact with each other, it's no problem.


----------



## KingEngineRevUp

zhrooms said:


> maybe refresh the page before you start quoting?


Uh, no. I'm not going to refresh every single time I'm about to reply; no one does that regularly, and neither do you.




zhrooms said:


> Also, maybe use your brain for once?


You first. This conversation started because of you and your poor choice of words, not me.


----------



## Db_11

I'll get straight to the point: here in Australia the 3080 FTW3 Ultra can be purchased for $1259 AUD, whereas the 3080 Strix OC is $1799 AUD. So there is a $540 AUD price difference, which makes the ASUS card the worst value possible.
But even if they were priced similarly, I still would not pick the ASUS card. Let me explain. Having had experience with both ASUS and Gigabyte (can't speak for others), they will not cover the following under warranty:
Physical damage
Physical tampering
Overclocking damage of any nature

The above are without exception, so EVGA's policy is worth more than is immediately obvious if you want to explore any of the above. Personally, I would never attempt modifications with other brands unless you're prepared to lose the entire purchase cost. Anyone who claims the other brands will fix tampered-with cards is really full of it... something you all here should seriously consider.

Be happy with what you have and try to preserve your cards to enjoy your gaming without pushing them too far, given the above caveats. The truth is, in real-world gaming they really aren't too far apart, more so in Australia given the price disparity.


----------



## Vapochilled

MangixZ said:


> A new vBIOS for the AORUS GeForce RTX™ 3080 MASTER 10G is available now.
> 
> AORUS GeForce RTX™ 3080 MASTER 10G (rev. 1.0) | GIGABYTE (www.gigabyte.cn)
> 
> Can anyone try it and give feedback?


I think it's the same crap, 370W...

Release notes say:

Optimize Fan Curve
For F1 BIOS Flash


----------



## doom26464

I just snagged an MSI Gaming X Trio. Not the card I had my heart set on, but with supply where it's at, I'd take whatever I could get my hands on.

This card seems to be very limited by its power limit. Triple 8-pin, and the cooler really does an impressive job. Sucks that MSI power-limited this card so badly; I get like 340-350W max on it with clocks around the mid-1900s.

Has anyone had success flashing the card with a different BIOS? Is MSI going to make a better vBIOS for this thing?

Will flashing the vBIOS void my warranty too?


----------



## Dreams-Visions

Vapochilled said:


> I think its the same crap 370W....
> 
> Release notes say :
> 
> Optimize Fan Curve
> For F1 BIOS Flash


I mean, it can't get much higher than 370W out of 2x 8-pins, bruh. That's the sacrifice that comes with them limiting their 3x 8-pin variant to only the Aorus Extreme.

Big missed opportunity to compete better against the FTW3 Ultra and Strix, especially given their asking price.
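For reference, that ~370W ceiling tracks the nominal PCIe power budget: each 8-pin connector is specced for 150W and the slot for 75W, and vendors set BIOS limits at or below those ratings. A quick sketch:

```python
# Nominal PCIe power budget: 150W per 8-pin connector + 75W from the slot.
EIGHT_PIN_W = 150
SLOT_W = 75

def board_power_ceiling(n_eight_pins: int) -> int:
    """Spec-rated power available to a card with n 8-pin connectors."""
    return n_eight_pins * EIGHT_PIN_W + SLOT_W

print(board_power_ceiling(2))  # 375 -> why 2x 8-pin cards top out around 370W
print(board_power_ceiling(3))  # 525 -> headroom for 450W+ BIOSes on 3x 8-pin cards
```

In practice connectors can deliver more than their rating, but partners generally keep stock power limits within these numbers, which is why the 450W BIOSes in this thread all come from 3x 8-pin cards.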


----------



## Mucho

But I think pulling 450W for 10 FPS isn't the way to go all day. For playing around a bit and OC, it's okay. I think running the card undervolted at 280-300W is much better.


----------



## Mucho

-


----------



## spajdr

Aand back to the Eagle OC BIOS; nothing interesting gained for me by using other BIOSes.


----------



## BugFreak

Has anyone tried the new eVGA Beta BIOS yet? Looks to unlock it to 450w...

eVGA Forums Link


----------



## criminal

I need a better CPU.










Anyway, I am glad I got a card!


----------



## MikeGR7

doom26464 said:


> I just snagged an MSI Gaming X Trio. Not the card I had my heart set on, but with supply where it's at, I'd take whatever I could get my hands on.
> 
> This card seems to be very limited by its power limit. Triple 8-pin, and the cooler really does an impressive job. Sucks that MSI power-limited this card so badly; I get like 340-350W max on it with clocks around the mid-1900s.
> 
> Has anyone had success flashing the card with a different BIOS? Is MSI going to make a better vBIOS for this thing?
> 
> Will flashing the vBIOS void my warranty too?


Listen friend, I don't know your budget, and a Strix/FTW3 is obviously a bit better, but if you had planned on getting anything else, I can assure you the TRIO is the best choice you could have made.
It has a beastly cooler and, more importantly, 3x 8-pin PCIe connectors that give you way more flexibility in BIOS flashing!
I've had the FE, TUF OC, TUF, and TRIO. MSI's card is the only one from that bunch that works great with the Strix OC vBIOS, with truly stable frequencies above 2100+; I'm using it right now.

Flash the Strix OC and enjoy.


----------



## freejak13

BugFreak said:


> Has anyone tried the new eVGA Beta BIOS yet? Looks to unlock it to 450w...
> 
> eVGA Forums Link


Just ran it through port royal. Got a 200 point bump from it. https://www.3dmark.com/pr/400911


----------



## jofbig

bobby_b said:


> I don´t know about the bios, but you don´t have to worry about the capacitors! Everything is fine on that picture. If you watch closely, you´ll see that they are connected either way by the little square copper points/plates on the pcb. So, if those caps on that copper plates have contact to each other, it´s no problem.


This seems to be the case, so I shouldn't worry about this. Many thanks for your explanation


----------



## MikeGR7

Btw, has any good soul tried the new FTW3 BIOS on the TRIO?
I need to know if the TRIO is able to run 3200 RPM on its cooler, because the Strix BIOS costs us 200 RPM 😋


----------



## BluePaint

doom26464 said:


> I just snagged a msi gaming X trio. Anyone had success flashing the card with a different bios?


The best BIOS I've tested so far is the Strix BIOS. It gives you a 450W PT. Alternatively, there's the new EVGA FTW3 BIOS with a 450W PT, but I haven't tried that yet.
19969 GPU score in Time Spy: MSI Trio + Strix BIOS.

The Trio is not the greatest card and a bit overpriced, but it's pretty quiet out of the box (I also don't have coil whine), and the 3x 8-pin connectors give you the possibility of using a BIOS from various 3x8 cards, which is great. For daily use and gaming, the higher PT is not that important because the chip gets quite inefficient above 300W, but it gives you peace of mind.

As long as you manage to flash the original BIOS back, you won't lose the warranty. If you can't flash it back, you might have a problem.


----------



## KingEngineRevUp

criminal said:


> I need a better CPU.
> 
> View attachment 2462063
> 
> 
> Anyway, I am glad I got a card!


That's not your cpu, that has more to do with other issues.


----------



## doom26464

BluePaint said:


> The best BIOS I've tested so far is the Strix BIOS. It gives you a 450W PT. Alternatively, there's the new EVGA FTW3 BIOS with a 450W PT, but I haven't tried that yet.
> 19969 GPU score in Time Spy: MSI Trio + Strix BIOS.
> 
> The Trio is not the greatest card and a bit overpriced, but it's pretty quiet out of the box (I also don't have coil whine), and the 3x 8-pin connectors give you the possibility of using a BIOS from various 3x8 cards, which is great. For daily use and gaming, the higher PT is not that important because the chip gets quite inefficient above 300W, but it gives you peace of mind.
> 
> As long as you manage to flash the original BIOS back, you won't lose the warranty. If you can't flash it back, you might have a problem.


Thanks for the info! I'm a bit nervous about BIOS flashing, as I've never done it before, and the MSI card only has a single BIOS, so it seems a bit risky. You don't lose RGB, fan control, display ports, or hit any other issues when flashing to the Strix BIOS?

I'll have to put this card through Time Spy to see if all the extra juice and heat is worth the performance.

I paid MSRP plus 100 bucks for a 2-year in-store warranty, so not the end of the world. I wanted an FTW3 or Strix, but beggars can't be choosers at this point in time. It also has to be better than the Ventus and other 2-pin cards, at least? Though other 2-pin cards seem to have power limits of 370W, where MSI limited this one to 340-350W :/


----------



## criminal

doom26464 said:


> I just snagged an MSI Gaming X Trio. Not the card I had my heart set on, but with supply where it's at, I'd take whatever I could get my hands on.
> 
> This card seems to be very limited by its power limit. Triple 8-pin, and the cooler really does an impressive job. Sucks that MSI power-limited this card so badly; I get like 340-350W max on it with clocks around the mid-1900s.
> 
> Has anyone had success flashing the card with a different BIOS? Is MSI going to make a better vBIOS for this thing?
> 
> Will flashing the vBIOS void my warranty too?


Awesome! Glad you finally got one.


----------



## BluePaint

doom26464 said:


> You dont loose rgb, fan control or display ports or any other issues when flashing to the strix bios?
> Ill have to put this card through time spy to see if all the extra juice and heat is worth the performance.


BIOS flashing isn't too risky. Just don't do it while you have a new CPU/RAM overclock or something; I always boot at stock clocks for flashing.
Should you really mess up your card with a BIOS, you can usually salvage it by booting with a 2nd GPU (or iGPU) so that you can flash the dead card in another PCIe slot.

Fan control is similar between Strix and Trio, so you don't lose fan-off mode, for example. RGB I haven't checked (don't care). A DP port you might lose, because the Strix has 2x HDMI. Haven't checked.

For daily gaming the additional PT is really not that relevant. Before I had the Strix BIOS, the 100W lower PT gave me just about 2 or maybe 3% less performance in 3DMark benches. Except for benching, I prefer undervolting + quiet operation.

A better "pure" GPU test is Port Royal, btw. Time Spy is more dependent on CPU + RAM, so you will see less difference between a stock and an OC'd GPU.


----------



## doom26464

BluePaint said:


> BIOS flashing isn't too risky. Just don't do it while you have a new CPU/RAM overclock or something; I always boot at stock clocks for flashing.
> Should you really mess up your card with a BIOS, you can usually salvage it by booting with a 2nd GPU (or iGPU) so that you can flash the dead card in another PCIe slot.
> 
> Fan control is similar between Strix and Trio, so you don't lose fan-off mode, for example. RGB I haven't checked (don't care). A DP port you might lose, because the Strix has 2x HDMI. Haven't checked.
> 
> For daily gaming the additional PT is really not that relevant. Before I had the Strix BIOS, the 100W lower PT gave me just about 2 or maybe 3% less performance in 3DMark benches. Except for benching, I prefer undervolting + quiet operation.
> 
> A better "pure" GPU test is Port Royal, btw. Time Spy is more dependent on CPU + RAM, so you will see less difference between a stock and an OC'd GPU.


So for 100W more you get 2-3% more performance in synthetics. I imagine in gaming it's like a 1% FPS uplift for the extra juice.

Doesn't seem worth it to me. I use my PC for streaming too, so noise and heat are factors.

I'll probably see what little core clock and memory I can squeeze out with the stock BIOS and call it a day. Tuning the fan curve at this point seems like the best investment of my time.


----------



## Chrisch

i need a waterblock for my TUF 



https://www.3dmark.com/pr/401035













https://www.3dmark.com/spy/14571857


----------



## criminal

KingEngineRevUp said:


> That's not your cpu, that has more to do with other issues.


You were right. I had Gsync on. 🤦‍♂️


----------



## gerardfraser

rambosbff said:


> Yeah, but we're tweakers and another tweaker got me tweakin'. Lol. Thanks for the response though, I think I'll just try to enjoy it now, but what's the fun of that without trying to break it first?


I am a tweaker also. I tried to break 3 RTX 3080s.


----------



## cstkl1




----------



## Colonel_Klinck

Hey all, been reading through the thread. I have a TUF OC on order, not too far back in the queue. The card is going on water as soon as EK gets the blocks out. The question is: should I stick with the TUF OC or upgrade to the Strix while I still can? My monitor is an LG 48CX, so the more fps the better. I'm guessing the only difference between the regular Strix and the OC is the power limit in the BIOS?


----------



## cstkl1

Colonel_Klinck said:


> Hey all been reading through the thread. I have a TUF OC on order, not too far back in the queue. Card is going on water as soon as EK get the blocks out. Question is should I stick with the TUF OC or upgrade to the Strix while I still can? Monitor is a LG 48CX so more fps the better. I'm guessing the only difference between the regular Strix and OC is power limit in bios?


If on water, upgrade to the Strix. The TUF is not worth watercooling.


----------



## KingEngineRevUp

Colonel_Klinck said:


> Hey all been reading through the thread. I have a TUF OC on order, not too far back in the queue. Card is going on water as soon as EK get the blocks out. Question is should I stick with the TUF OC or upgrade to the Strix while I still can? Monitor is a LG 48CX so more fps the better. I'm guessing the only difference between the regular Strix and OC is power limit in bios?


If you're going to do water, you might as well go all out right?


----------



## Celeras

Do XC3s historically get BIOS updates from EVGA? Would love the fake XOC version for mine.


----------



## eeroo94

My 3080 Ventus actually seems to be a solid chip; I managed to break a 19k graphics score with the 320W limit.


----------



## Colonel_Klinck

KingEngineRevUp said:


> If you're going to do water, you might as well go all out right?


Thanks. Is the OC worth the extra, or can I just flash the OC BIOS onto the standard card?


----------



## Zelo

eeroo94 said:


> My 3080 Ventus seems to be actually a solid chip, managed to break 19k graphics score with 320w limit.
> 
> View attachment 2462088


Nice what OC do you have on it?


----------



## doom26464

eeroo94 said:


> My 3080 Ventus seems to be actually a solid chip, managed to break 19k graphics score with 320w limit.
> 
> View attachment 2462088


That seems very high for the very power-limited Ventus.

What's your overclock like?

Doesn't a bone-stock 3080 FE usually hit a graphics score in the 17.5k range, or high 17,000s?


----------



## eeroo94

Zelo said:


> Nice what OC do you have on it?


+120 core / +500 memory, I believe. It was doing 1995 MHz with only 0.925V.


----------



## KingEngineRevUp

eeroo94 said:


> +120 core/500 memory I believe. It was doing 1995 Mhz with only 0.925v.





https://www.3dmark.com/spy/14577791



Do you have an air conditioner blowing inside this thing? 50°C is pretty low temperature-wise, lol. I imagine during portions of the test you were under 50°C and getting some pretty high boost clocks.


----------



## Zeakie

www.3dmark.com/3dm/51717081? Here's my score with a Zotac and a quick curve. More tinkering tomorrow, but I reckon I can break 19k.


----------



## Zelo

Not bad scores with a +100/+500 OC. I think I'll leave the OC at that and be happy gaming for a while.


----------



## Db_11

cstkl1 said:


> View attachment 2462083


What's the purpose of this picture? Are you claiming this is your setup? You haven't provided any narrative.


----------



## Talon2016

Chrisch said:


> i need a waterblock for my TUF
> 
> 
> 
> https://www.3dmark.com/pr/401035
> 
> 
> View attachment 2462077
> 
> 
> 
> 
> https://www.3dmark.com/spy/14571857
> 
> 
> View attachment 2462078


Shunt mod?


----------



## Purple_Light

Does anyone understand what is preventing these cards from scaling up with power?


----------



## acoustic

Had my FTW3 on the STRIX OC BIOS. Just flashed the new FTW3 BIOS. Metro Exodus hits 430-440W, and pinged 450W once or twice. I'll probably stick with the new FTW3 BIOS, at least until the card is under water; the difference between the 400W and 450W BIOS is basically negated under sustained load anyway.


----------



## Db_11

Frustration is setting in for most because of a simple issue that cannot be overcome regardless of how many watts you add, and that issue is thermals.
If certain temperature targets are met, clocks get boosted, resulting in better performance; the opposite occurs as temperatures rise. I have seen advice here claiming "don't watercool your TUF card, buy a Strix instead and watercool that". That may be good advice when everything else is equal, but it isn't, because Strix cards are even rarer than other 3080 cards and at the very top in price, so in that context it's not sound advice.
For the record, and given the boost/temps trade-off in the NVIDIA architecture, a TUF card on watercooling will match and mostly exceed any current 3080 on air, assuming the silicon on that card falls within the median. Why? Because it will be afforded additional frequency boost for longer in games and testing by keeping thermals below 50°C, and it doesn't require more voltage than it currently has to do so; at or below those temps, 2.2GHz is the target on boosts.
So if you can grab a TUF card with a waterblock for the same price as a Strix on air, you will lack for nothing in performance against it. Power is not the issue with these cards, it's thermals and the inherent throttling as a consequence, so it's a balancing act to flatten the frequency curve by normalising thermals to get the best performance.
For what it's worth, Gamers Nexus put a 3080 FTW card under liquid nitrogen, the most extreme cooling, and that card achieved a core clock of just over 2.4GHz with huge voltage, so that's the best-case scenario with unrealistic cooling and voltage.
If you're currently getting a stable 2GHz on air day to day, or thereabouts, then save your money and invest in memory or CPU upgrades, where you will easily gain additional performance by optimising all subsystems, which has an overall complementary benefit to total performance.
Hope that helps. Cheers
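The boost/temps trade-off described above can be sketched with a toy model: GPU Boost steps the clock down in small bins as core temperature crosses successive thresholds. The 15 MHz bin size and the threshold temperatures below are illustrative assumptions for this sketch, not NVIDIA-documented values.

```python
# Toy model of GPU Boost temperature binning (illustrative only).
# Assumption: the card loses one ~15 MHz boost bin at each temperature
# threshold in THRESHOLDS_C. Neither number is an official NVIDIA spec.

BIN_MHZ = 15
THRESHOLDS_C = [35, 45, 55, 62, 68, 74]  # hypothetical bin-drop points

def boost_clock(max_clock_mhz: int, temp_c: float) -> int:
    """Return the modelled sustained clock at a given core temperature."""
    bins_lost = sum(1 for t in THRESHOLDS_C if temp_c >= t)
    return max_clock_mhz - bins_lost * BIN_MHZ

# A card that peaks at 2100 MHz when cold:
print(boost_clock(2100, 30))   # 2100 - no thresholds crossed
print(boost_clock(2100, 50))   # 2070 - two thresholds crossed
print(boost_clock(2100, 75))   # 2010 - all six crossed
```

This is why a waterblock that holds the core under the first few thresholds sustains a visibly higher clock than an air cooler at the same power limit.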


----------



## cstkl1

Db_11 said:


> Frustration is setting in for most because of a simple issue that cannot be overcome regardless of how many watts you add ... and that issue is thermals.
> If certain temperature targets are met then clocks get boosted resulting in better performance .. the opposite occurs as temperatures rise. I have seen advice here claiming "dont watercool your TUF card buy a Strix instead and watercool that" . That may be good advice when everything else is equal .. but it isnt because strix cards are even rarer than other 3080 cards and at the very top in price so in that context its not sound advice.
> For the record, and given the boost/temps tradeoff on nvidia archtecture, a TUF card on watercooling will match and mostly exceed any current 3080 on air assuming silicone on that card falls within the median .. why ? .. because it will be afforded additional frequency boost longer in game and testing by keeping thermals below 50c .. and it doesnt require more voltage than it currently has to do so .. at or below those temps 2.2ghz is the target on boosts.
> So if you can grab a TUF card with waterblock for the same price as a Strix on air then you will lack for nothing in performance against it ... power is not the issue with these cards its thermals and the inherent throttling as a consequence so its a balancing act to flatten the frequency curve by normalising thermals to get the best performance.
> Hope that helps ... Cheers


Also, the narrative is: that's how the Strix looks with a BP block.

Spoken like a dude who doesn't have any card. Long story.
Third post on the forum, right?

I've got a TUF and a Strix, so no need to claim.
My friend also, whose link I posted.

The Strix, and now it looks like the FTW3 Ultra too: atm these two cards are the only ones worth watercooling.

If and when the TUF gets a mod BIOS, only then is it worth it. Heck, it will make the Strix a useless card to buy.

The Strix 3090 is the same case. It's just nuts on air.
Why? Because it has a 480W power limit. Worth watercooling to sustain high clocks. This is why I didn't get the TUF: the 3080 already proved the TUF's power limit issue. But now I realize I made a blunder. Never trust reviews.
The 3090 Strix is insane. If I could, I would sell my 3080 Strix with the waterblock to buy another 3090 Strix. But fat chance of that happening this year. No way I can get another 3090 Strix.

@Baasha bro, you've got to get Strix 3090 SLI for both your setups, bro.


----------



## Db_11

Read my post again, specifically the Gamers Nexus part.
I work for a system builder, for what it's worth, and I'm in my 50s.
Only NVIDIA can relax the thermal/clock dependencies with Ampere, as that's hard-coded into the architecture. 2.2GHz is the best you will see working around that, short of cooling with phase change or liquid nitrogen.
The advice given was sound. Cheers


----------



## cstkl1

Db_11 said:


> Read my post again .. specifically gamers nexus.
> Work for a system builder for what its worth and in my 50's.
> Advice given was sound .. cheers


Read every post I ever wrote: got warned and got deleted. Not gonna start another round about the dumbass trio that Steve belongs to.

The only cards worth watercooling are those that hit Vrel, not power. The Strix, for example, easily sustains 2100-2070, and it's reduced because of temp. Vrel, not power.

The TUF just hits every limit. The stock temp of the TUF is so low, it's just a well-built card. Atm not worth watercooling.

If you're from a system builder in Australia and in your 50s, back the credential up with a post, dude. Which system builder? I could even make a guess; Australia isn't a big country. But rather, you state it.

It's hard to swallow anything backed only by a well-written story.


----------



## Db_11

I've left emotion out of it; clearly you haven't, but that's OK as well. We have noticed overall system gains of up to almost 10% simply by optimising the memory and CPU subsystems in our builds, which simply cannot be achieved by pushing the graphics subsystem any further, so it emphatically supports the law of diminishing returns.
I posted what I did to offer a different perspective, as total system performance, and its complementary benefits to all subsystems, is often overlooked by enthusiasts.
I welcome your view, and fully support your enthusiasm, because after all I would be looking for a new job if it weren't for passionate people like yourself.


----------



## cstkl1

Db_11 said:


> Ive left emotion out of it .. clearly you havent but thats ok as well. We have noticed overall system gains of up to almost 10% simply by optimising memory and cpu subsystems in our builds which simply cannot be achieved by pushing the graphics subsystem any further so it creates and emphatically supports the law of diminishing returns.
> I posted what i have to afford a different perspective as total system performance .. and its complimentary benefits to all subsystems .. is often overlooked with enthusiasts.
> I welcome your view, and fully support your enthusiasm, because after all i would be looking for a new job if it werent for passionate people like yourself.


Again, a long story. I almost yawned.
1. Known fact. Hence why it's overclock.net, not barneyandfriends.net, lol.
2. No emotion here. But defending with a long story seems emotional.

But let's stop this, btw, before I get banned from this forum for good.

Post, dude. What backs up the claim that the TUF on water is worth it, other than a long story and claims of age-long wisdom? You've digressed already. Why should a person spend 30% of the price of the TUF to buy a waterblock (shipping included)?


----------



## Alemancio

Purple_Light said:


> Does any1 understand what is preventing these cards to scale up with power ?


Temps


----------



## Celeras

doom26464 said:


> That seems very high for the very power limited ventus.
> 
> Whats your overclock like.
> 
> Doesnt a bone stock 3080fe usually hit a 17.5ish range graphic score? Or high 1700s?


My XC3 Ultra is also a 2x8-pin that never gets above 340W; stock was 17.1k and OC'd around 19.2k.


----------



## dentnu

I just managed to secure a 3080 FE from Best Buy earlier today. It's not the card I wanted, but I guess it will do for a few months till supply improves. I was honestly shocked I was able to get one. I have two questions which have probably been asked hundreds of times already, but I don't have the time to read all the pages.

1. Do any of the other BIOSes, like the Strix or FTW3, work with the FE?

2. Has anyone here managed to put an FE under water? If so, how much headroom has that gained you, and is it worth it?


----------



## trippinonprozac

Quick question to the gurus: I have applied a decent undervolt and it is working well, but since doing the custom curve I am finding the card doesn't clock back down in Windows. It currently sits at 1800MHz core and above 0.875V.

Why won't it return to a normal state in 2D?


----------



## cstkl1

Hmm, I wonder where this dude db_11 is from, claiming stuff.

Zero record on XS for someone who claims to be old. Fourth post in.


Db_11 said:


> Will get straight to the point .. Here in Australia the 3080 FTW Ultra can be purchased for $1259 AUD whereas the 3080 Strix OC is $1799 AUD .. So there is a $540 AUD price difference which makes the Asus card the worst value possible.
> But even if they were priced similarly I still would not pick the Asus card .. let me explain. Having had experience with both Asus and Gigabyte .. cant speak of others .. they will not cover under warranty the following:
> Physical Damage
> Physical Tampering
> Overclocking Damage of any nature
> 
> The above are without exception so EVGA's policy is worth more than is immediately obvious if you want to explore any of the above. For me personally, I would never attempt modifications with other brands unless you're prepared to lose the entire purchase cost. Anyone who claims the other brands will fix cards tampered with are really full of it ... something you all here should seriously consider.
> 
> Be happy with what you have and try to preserve your cards to enjoy your gaming without pushing them too far given the above caveats. Truth is in real world gaming they really arent too far apart .. more so in Australia especially given the price disparity.


Interesting. Are you sure you're in your 50s and a system builder?

Let's start with some BS.
Asus pulled out of all its country offices, and support is now handled by third parties everywhere in Asia-Pacific except Singapore.
Before: RMA, damage, etc. could all be repaired by just paying money.
Now it's up to the third party, or you RMA directly to Taiwan and get the repair cost.
Two to three years ago, Asus started with the warranty sticker.
Waterblock on the GPU under warranty: if you bought from a premium reseller, no problem. No f given, no issue.
End user direct to the third-party Asus office: MYR 75.
For premium resellers, RMAing something with a mobo, even damaging the board, is fine. Heck, they even RMA'd a delidded CPU after killing it, no issue for an Intel platinum partner.
For everything else, just get the costing. But what I don't like with direct-to-Taiwan is you will be placed in a global queue system, and I think now, depending on the component, some RMAs are sent to China.

MSI: MSI in Malaysia is MSI from Taiwan. The FAE is here; I've known the guy for 10-15 years. No issue RMAing anything, just like Asus before the third party. Watercooling: no problem. Damaged component: just pay the repair cost. With no physical damage you can get a 1:1 swap depending on stock, or a credit note to be used at any reseller.

Gigabyte: ah, this is where my FULL hate stems from. They will not cover anything, even if their components killed another due to bad QC. They won't even repair, local rep and Taiwan both. So, middle finger.

Zotac: all depends on the distro/reseller. Here, no problem, just like MSI.

Galax: my only experience with them was an 8800 GTX. Killed it with watercooling, no physical damage, no problem.

EVGA: an American company, same as Corsair, OCZ, and BFG of the past. They are excellent on CS and practice whatever they do in the US worldwide.


----------



## cstkl1

dentnu said:


> I just managed to secure a 3080 FE from best buy earlier today. It’s not the card I wanted but guess it will do for few a few months till supply improve. I was honestly shocked I was able to get one. I have a two questions which probably been asked already hundreds of times but don't have the time to read all the pages.
> 
> 1. Do any of the other bios like the Strix or FTW3 work with the FE?
> 
> 2. Anyone here mange to put an FE underwater if so how much head room as that gained you is it worth it?


I suspect the InfoROM, or that the nvflash from Inno3D is incomplete.

Just a suspicion; there's no guarantee that flashing the TUF BIOS gets you 370W.

But for some odd reason, the FTW3 and Trio can be flashed with the Strix BIOS.

So:
1. If the power config is in the InfoROM, is the TUF version protected? I could not re-enable protection on the TUF because it says date mismatch, hence the InfoROM was never flashed when I tried the TUF OC BIOS.
2. Is the Strix, etc., not protected in the InfoROM?
3. Is that nvflash the full version, or do we need a modded version like in Turing?

Just assumptions, nothing concrete, a lot of questions. That's why I am staying away from flashing.

Someone needs to decompile the Gigabyte and EVGA BIOS flashing utilities and see which nvflash they're using.


----------



## Db_11

I have said what I said and stand by it, and did so in a calm and rational way. I have not at any stage resorted to any shaming tactics or derogatory remarks, as would seem to be the case with yourself, but I do appreciate that all people are different. Furthermore, I explained the policy of Asus and Gigabyte and made no further comments on others, as clearly highlighted in my post.
When you achieve a better frequency than 2.2GHz on any 3080/90 on air or water cooling, given the current restrictions of the Ampere architecture, feel free to post here and I will gladly stand corrected.
No matter how passionate you are, and how indignant you are toward other posters here in the forum, facts are facts and the rest is speculation and unjustified innuendo. I posted what my thoughts were, something for others to consider. At no stage did I attempt to ram it down anyone's throat.
As for having a mature discussion, and comparing how you and I have acquitted ourselves to that end, I will simply leave others here to decide for themselves.
I appreciate the passion, but no one here should be intimidated because of that passion. You raise some interesting points, and kudos for doing so. It isn't your knowledge, it's your approach; in the end we all here want the same thing.
Hope that clears things up. Cheers


----------



## MangixZ

cstkl1 said:


> i suspect inforom or that nvflash by inno3d is incomplete
> 
> just suspect, theres no guarantee on flashing tuf to get 370w
> 
> but for odd reason ftw3, trio can be flashed with strix
> 
> so
> 1. if the power config is in inforom.. is the tuf version protected? i could not enable back protection on tuf cause it says date mismatch hence the inforom was nvr flashed wheb i tried tuf oc bios
> 2. strix etc is not protected on inforom??
> 3. is that nvflash the full version?? or do we need a mod version like in turing..
> 
> just assumption. nothing concrete. alot of questions. thats y i am staying away from flashing
> 
> need to uncompile that giga bios and evga bios flashing utility and see which nvflash its using.


At present, none of the 2x8-pin cards can really reach 370W.


----------



## Db_11

One thing that intrigues me with NVIDIA is how much more performance they can squeeze out of Ampere with drivers. I distinctly remember, going back 19 years or so, when they released Detonator drivers that yielded a 20% to 40% performance improvement for their GeForce2 offerings when ATI (now AMD) were threatening their dominance. Given that history, it can be presumed they may have some headroom to play with; speculation, of course, but backed up by history.
It will be interesting to see how things play out with the new Radeons on the imminent horizon.
Also, it is generally uncommon for a lithography shrink to yield higher power consumption, as is currently being experienced with Ampere. Interesting times ahead, but we will all know soon enough, which is good news for everyone.


----------



## cstkl1

Db_11 said:


> I have said what i said and stand by it .. and did so in a calm and rational way. I have not at any stage resorted to any shaming tactics or derogatory remarks as would seem to be the case with yourself but I do appreciate that all people are different. Furthermore, I explained the policy of Asus and Gigabyte and made no further comments on others as clearly highlighted in my post.
> When you achieve a better frequency than 2.2Ghz on any 3080/90 on air or water cooling, given the current restrictions of Ampere architecture, feel free to post here and i will gladly stand corrected.
> No matter how passionate you are, and how indignant you are to other posters here in the forum, facts are facts and the rest is speculation and unjustified inuendo. I posted what my thoughts were .. something for others to consider. At no stage did I attempt to ram it down anyone's throat.
> Having a mature discussion, and comparing how you and I have acquitted ourselves to that end .. I will simply leave others here to decide for themselves.
> I appreciate the passion but no one here should be intimidated due to that passion. You raise some interesting points and kudos for doing so. It isnt your knowledge .. its your approach ... in the end we all here want the same thing.
> Hope that clears things up ... Cheers


OK, maybe we got off on the wrong foot here. Restart.

What are your claims, dude? You've been storytelling, and doing reverse intimidation when questioned by placing yourself above others with a know-it-all, take-it-or-leave-it tone.
So, restart.

You still haven't shown anything.
What data/screenshots do you have to show that the TUF on watercooling is worth it?

Let's start with this.

Then proceed:
do you have any of the cards?


----------



## cstkl1

MangixZ said:


> At present, none of the 2 * 8pin cards can really reach 370W.


TUF: close. 36x spikes, 350W constant.
The spikes are because the 6x figure is the PCIe power limit.


----------



## parcher

Zeakie said:


> www.3dmark.com/3dm/51717081? Here's me score with a zotac and a quick curve.. more tinkering tomorrow but I reckon I can break 19k


With the Palit OC upgrade BIOS? [emoji848]

Sent from my RNE-L21 using Tapatalk


----------



## nootnoot

Has anyone flashed a Founders card with another brand's BIOS? I searched through quite a few pages and didn't see anyone mentioning it.


----------



## deerwasquick

Flashed a non-OC Zotac with the TUF OC BIOS and wasn't able to push past 330W. Not sure if it's a hardware limit for some of the cards or what.

Has anyone been able to turn the EEPROM protection back on (the --protecton command in nvflash)? I'm getting an error when I try it after flashing back to the stock BIOS.
@Zeakie tagging you because I saw you have a Zotac card.


----------



## hemon

Guys, how much of an FPS difference @1440p could there be with the 3080 Strix and +100W compared to the TUF? With undervolting, my TUF stays stable at about 1950-1980MHz. I suppose the Strix has +100MHz, right? So, again: how many FPS @1440p is that?


----------



## Boham_CY

Hey everyone, my Palit 3080 is getting here Monday, was wondering what's the best bios for it in terms of upping the power limit?
Fingers crossed on silicon lottery


----------



## VPII

Boham_CY said:


> Hey everyone, my Palit 3080 is getting here Monday, was wondering what's the best bios for it in terms of upping the power limit?
> Fingers crossed on silicon lottery


My friend, do me a favour: when you get your card, please run Time Spy stock with the power limit increased by 9%, but with the MSI Afterburner on-screen display so you can check where your clock speeds start dropping. I have an issue with mine: the 9% increased power limit does not give any additional performance, as clocks still drop at 320W and higher, and therefore none of the 7 or 8 BIOSes I tried made any difference. Once you've done the Time Spy run, you can try the Asus TUF OC BIOS and see if that changes things for you.


----------



## KingEngineRevUp

hemon said:


> Guys, how many FPS-difference @1440p there could be with the 3080 Strix and +100W compared to the TUF? With undervolting my TUF stays stable at about 1950-1980Mhz. I suppose that the Strix has +100Mhz, right? So, again: how many FPS @1440p is that?


Like 2 frames. Maybe 3. You think that's a joke don't you? It's not.


----------



## Zeakie

parcher said:


> With the Palit OC upgrade BIOS? [emoji848]
> 
> Sent from my RNE-L21 using Tapatalk


This is on the TUF BIOS. I'm switching between the two BIOSes to see which one I can squeeze the most out of.


----------



## Zeakie

deerwasquick said:


> Flashed a non oc zotac with the tuf oc bios and wasn’t able to push past 330w. Not sure if it’s a hardware limit for some of the cards or what.
> 
> Has anyone been able to turn the eeprom protect back on (the --protecton command in nvflash)? I'm getting an error when I try to when I flash back to stock bios.
> @Zeakie tagging you because I saw you have a zotac card.


I have the OC variant with the Palit/TUF BIOS. Flashing back to the stock OC ROM works, no issue.


----------



## Adrian76

deerwasquick said:


> Flashed a non oc zotac with the tuf oc bios and wasn’t able to push past 330w. Not sure if it’s a hardware limit for some of the cards or what.
> 
> Has anyone been able to turn the eeprom protect back on (the --protecton command in nvflash)? I'm getting an error when I try to when I flash back to stock bios.
> @Zeakie tagging you because I saw you have a zotac card.


It does work; ignore the error, NVFLASH is ****ed. Run the protect command, then try to flash any other BIOS again: it says it can't flash because protection is on.


----------



## Adrian76

On the Zotac non-OC, the TUF OC BIOS is best so far. The slider to 109% works but doesn't change power. Average power is just over 330W; I've seen it spike to 354W, which never happens with the Zotac BIOS.

Tried the Palit BIOS, but no change in power and the fans are ****ed. Fans work fine with the TUF OC BIOS, but they're more aggressive than on the Zotac stock BIOS.


----------



## Zeakie

Adrian76 said:


> Zotac non OC TUF OC bios is best so far, Slider to 109% works but doesn't change power, Average power just over 330w seen it spike to 354w which never happens with Zotac bios.
> 
> Tried Palit bios but no change in power and fans are ****ed, Fans work fine with TUF OC bios but more aggressive than Zotac stock bios.


No fan issues on my OC variant; I really wonder how different our cards are.


----------



## Adrian76

Zeakie said:


> No fan issues on my oc variant really wonder how different our cards are


Yeah, it's weird. You can move the slider up but it does nothing to the fans, then all of a sudden at a certain level the fans jump in at high speed. It doesn't do this with the TUF OC BIOS.

Hmm, strange. I think everyone is under the illusion that the OC Trinity was the same as the non-OC but with a slightly higher base clock, but if you're not seeing that on your OC variant, maybe they have changed something? Who knows what.

Even the power on the Palit BIOS is the same as the Zotac, with the slider not doing anything to increase it, but what I noticed with the TUF OC BIOS was higher wattage and higher clocks in games by default, even though again the slider wasn't doing anything. Didn't try much else; it didn't seem worth it to me unless we can get a 350+W average.


----------



## Zeakie

Adrian76 said:


> Yeah it's weird, You can move the slider up but it does nothing to the fans then all of a sudden at a certain level the fans jump in at high speed, Doesn't do this with TUF OC bios.
> 
> Hmm strange I think everyone is under the illusion the OC Trinity was the same as the non OC but a slightly higher base clock but if your not seeing that on your OC variant maybe they have changed something? Who know what.
> 
> Even the power on the Palit bios is the same as the Zotac with the slider not doing anything to increase but what I noticed with TUF OC bios was higher wattage and higher clocks in games default even though again the slider wasn't doing anything, Didn't try much else, Didn't seem worth it to me unless we can get 350+w average.


Fan control was like any other BIOS; the fan slider worked at any %. On the TUF BIOS my power slider works and lets the card average around 340-350W instead of the 320-ish. The performance gain is marginal, but a big enough difference to make me stay well clear of the original BIOS.


----------



## cstkl1

hemon said:


> Guys, how many FPS-difference @1440p there could be with the 3080 Strix and +100W compared to the TUF? With undervolting my TUF stays stable at about 1950-1980Mhz. I suppose that the Strix has +100Mhz, right? So, again: how many FPS @1440p is that?


The Strix is pretty damn close to a stock 3090.

But the Strix 3090 is something else entirely...


----------



## Adrian76

Zeakie said:


> Fan control was like any other bios fan slider worked at any %. Tuf bios my power slider works let's the card average around 340w 350w instead of the 320ish . Performance is marginal but big enough difference to make me stay well clear of original bios


Yeah, strange. Something's changed. Until there is more stock and we see teardowns comparing both boards, we won't know.


----------



## cstkl1

Adrian76 said:


> It does work, Ignore the error NVFLASH is ****ed, Run the protect command then try to flash any other bios again and it says can't flash because protection is on.


Mine didn't do that on the TUF with the TUF OC BIOS.

The error, and the reason protect didn't work, was an InfoROM date mismatch.


----------



## Adrian76

cstkl1 said:


> mine didnt do that on tuf with tuf oc
> 
> the error y protect didnt work cause inforom date mismatch


That's weird considering it's the same card. I haven't flashed any InfoROM on this Zotac, just the BIOS, but I haven't had any error about the InfoROM either. I get an error from NVFLASH saying the EEPROM can't be locked, but it must be doing it, because I've just tried it again now and it's saying it's protected, and I have flashed other BIOSes. See here:


http://imgur.com/a/LUagewR


----------



## shiokarai

cstkl1 said:


> also narrative is Thats how strix looks like with bp block.
> 
> Spoken like a dude who doesnt have any card. Long story.
> 3rd post right Im the forum.
> 
> I got tuf and strix.. so no need to claim
> My friend also which i posted the link.
> 
> Strix and now looks like ftw3 Ultra.. these two cards atm only worth it to be water-cooled
> 
> if and when tuf gets a mod bios.. then only its worth it. Heck it will make strix a useless card to buy.
> 
> strix 3090 the same case. Its just nuts on air.
> y because it has a 480w powerlimit. Worth watercooling to get sustain high clocks. This i didnt get tuf because 3080 already proven tuf power limit issue. But now i realize i made a blunder. NVR trust reviews
> 3090 strix is insane. If i could now i would sell my 3080 strix with wc to buy another 3090 strix. But that fat chance of happening this year. No way can get another 3090 strix.
> 
> @Baasha bro you got to get strix 3090 sli for both your setup bro.


Yeah, yeah, yeah... a week ago you were raving about how the Strix OC is a bad, useless card and only the TUF is worth it, shunt it, etc. Just act somewhat coherent, man, not like a 5-year-old kid running around and shouting. Honestly...


----------



## Colonel_Klinck

cstkl1 said:


> If on water. Upgrade to strix. Tuf not worth watercooling


Spoke to OCUK this morning: now that they've sent out queue positions, they aren't letting people swap cards. It's a cancel-and-reorder situation, and for the Strix that would mean February/March. Going to stick with the TUF OC. It has to go on water, as my case is the Tower 900 with a dual loop; it's not an air-cooling case unless I leave the sides off.


----------



## cstkl1

shiokarai said:


> Yeah, yeah, yeah... a week ago you were raving about how the Strix OC is a bad, useless card and only the TUF is worth it, shunt it, etc. Just act somewhat coherent, man, not like a five-year-old kid running around and shouting. Honestly...


When did I change my position on the TUF?

I said that at the moment only the Strix is worth water-cooling,
but if the TUF gets a modded BIOS, the Strix becomes a useless card.

Guess you need to stop jumping about like a two-year-old who won something.


----------



## cstkl1

Colonel_Klinck said:


> Spoke to OCUK this morning: now that they've sent out queue positions, they aren't letting people swap cards. It's a cancel-and-reorder situation, and for the Strix that would mean February/March. Going to stick with the TUF OC. It has to go on water, as my case is the Tower 900 with a dual loop; it's not an air-cooling case unless I leave the sides off.


If there's a modded BIOS before that, you'll be laughing at all those other cards (mine included).


----------



## Nizzen

hemon said:


> Guys, how much FPS difference at 1440p could there be between the 3080 Strix with +100W and the TUF? With undervolting, my TUF stays stable at about 1950-1980MHz. I suppose the Strix gains about +100MHz, right? So, again: how many FPS at 1440p is that?


How long is a piece of string?


----------



## Purple_Light

Have you guys tried the nvflash 5.665.0 that someone extracted from the Palit BIOS update earlier? (Around page 35 somewhere.)


----------



## cstkl1

Asus Strix 3080 + Bitspower block, +75/+1000 stable.

First round of testing stable... bad card... will source another...


Spoiler

10900k - SP81
M12E - 098
52|49 - 1.374 L6 v/f
4600 [email protected] 
Asus Strix 3080 +75/1250


Benchmark

https://www.3dmark.com/spy/14589635




https://www.3dmark.com/pr/404502



Disappointed with the luck on this one.


----------



## Purple_Light

shALKE said:


> nvflash64 Version 5.665.0
> 
> nvflash64_Version_5.665.zip
> 
> drive.google.com
> 
> I extracted it from the Palit BIOS update. The BIOS file is password protected.


OK, found the post... Has anyone tried this nvflash version?


----------



## MikeSanders

Does anyone know the PCB of the Manli 3080?

Manli GeForce RTX™ 3080 - 10GB (M3478+N613)

New NVIDIA Ampere Architecture. Next-generation RT and Tensor Cores with twice the throughput. 19 Gbps GDDR6X memory.

www.manli.com

Is it a reference PCB?


----------



## Chrisch

New BIOS for the 3080 TUF (OC) and maybe the STRIX (8 ROM files in this flash tool):

> BIOS Update Tool
> Improved compatibility for 0dB fan feature

https://dlcdnets.asus.com/pub/ASUS/Graphic%20Card/NVIDIA/BIOSUPDATE_TOOL/3080biosupdate.zip

Direct download for the TUF OC (performance) BIOS:

TUF_OC_Update.rom at Filehorst - filehorst.de


----------



## Zeakie

www.3dmark.com/3dm/51742232? 19k, I can smell you... for a Zotac she's pushing well on air.


----------



## spajdr

MikeSanders said:


> Does anyone know the PCB of the Manli 3080?
> 
> Manli GeForce RTX™ 3080 - 10GB (M3478+N613)
> 
> www.manli.com
> 
> Is it a reference PCB?


I'm surprised that company still exists. They made the worst possible quality PC components in the past.


----------



## cstkl1

spajdr said:


> I'm surprised that company still exists. They made worst possible quality components for the PC in the past.


Miners. Where they exist, these cards thrive.


----------



## owntecx

Chrisch said:


> New BIOS for the 3080 TUF (OC) and maybe the STRIX (8 ROM files in this flash tool):
> 
> https://dlcdnets.asus.com/pub/ASUS/Graphic%20Card/NVIDIA/BIOSUPDATE_TOOL/3080biosupdate.zip
> 
> Direct download for the TUF OC (performance) BIOS:
> 
> TUF_OC_Update.rom at Filehorst - filehorst.de


Does it get to the 375w limit now?


----------



## XxXSpitfireXxX

Has anyone noticed changes in the boost algorithm in the 456.71 drivers? I used to be stable at +95 core / +700 memory; after the update, the memory is unstable above +600 and drags down overall performance, while the core now goes up to +145 before I see degradation in benchmarks or CTDs. My guess is they changed how the boost clock regulates itself so it doesn't spike too far above the average, or something like that.

I'm using a Gigabyte Gaming OC. Anyone else here have this card?


----------



## dr.Rafi

Vapochilled said:


> Would not work. 3x pin vs 2


It works for me, and the story that a 3-pin BIOS on a 2-pin card reports higher wattage while the real wattage is less is not true. I have a digital Corsair HX1600i power supply, so I can check the difference in actual wattage drawn, and my Ventus with the ASUS Strix BIOS (and also tested with the Aorus Master BIOS) holds higher stable clocks.


----------



## Vapochilled

dr.Rafi said:


> It works for me, and the story that a 3-pin BIOS on a 2-pin card reports higher wattage while the real wattage is less is not true. I have a digital Corsair HX1600i power supply, so I can check the difference in actual wattage drawn, and my Ventus with the ASUS Strix BIOS (and also tested with the Aorus Master BIOS) holds higher stable clocks.


You mean to say that on a 2x8-pin card you flashed a 3x8-pin BIOS, and you can draw above 400W now?
Is this confirmed?
Thanks


----------



## Zeakie

Vapochilled said:


> You mean to say that on a 2x8-pin card you flashed a 3x8-pin BIOS, and you can draw above 400W now?
> Is this confirmed?
> Thanks


I tried the Strix BIOS on my Zotac Trinity, a 3x8-pin BIOS on a 2x8-pin card. It showed 440W pulled while gaming, but way worse FPS than stock, and it wouldn't boost or hit clocks right. My Time Spy was 10k with the Strix BIOS compared to 12k on the TUF BIOS. No matter what overclock or undervolt I tried, the Strix BIOS gimped my card.


----------



## gemini002

owntecx said:


> Does it get to the 375w limit now?


Yes, just tested; the max is 375.3W.


----------



## owntecx

gemini002 said:


> Yes just tested max is 375.3


Nice. I'll get to test it myself later today.
Is that ~375W sustained, or a spike?


----------



## gemini002

XxXSpitfireXxX said:


> Has anyone noticed changes in the boost algorithm in the 456.71 drivers? I used to be stable at +95 core / +700 memory; after the update, the memory is unstable above +600 and drags down overall performance, while the core now goes up to +145 before I see degradation in benchmarks or CTDs. My guess is they changed how the boost clock regulates itself so it doesn't spike too far above the average, or something like that.
> 
> I'm using a Gigabyte Gaming OC. Anyone else here have this card?


yes I rolled back


----------



## gemini002

owntecx said:


> Nice. I'll get to test it myself later today.
> Is that ~375W sustained, or a spike?


sustained


----------



## Zeakie

gemini002 said:


> sustained


I'll have to flash it on the Zotac, as I'm already running the TUF OC BIOS at the moment. Will report back on any improvement.


----------



## DStealth

This new TUF OC BIOS on my Palit is holding a 340W limit no matter that the extended power limit is set to 370W. Not bad for what I assume is the Palit non-OC version, stock and air-cooled at 0.92V.


----------



## Zeakie

www.3dmark.com/3dm/51751552?
My GPU score went up by 100 just by flashing it.
Power is now at 340-350W under Port Royal load, fluctuating less than before; it seems a little more stable on clocks too.


----------



## slopokdave

Haha, oops... I'm in the 3080 thread by accident, lol. Disregard.


----------



## doom26464

I just started overclocking my MSI Trio. I haven't touched memory yet, but it's stable at +100 on the core, no problem. I threw a bunch of games and benchmarks at it and so far no crashes. I can bench at +110, but it crashes in Warzone.

It boosts up to about [email protected] and the Time Spy graphics score is 18.2k. Seems pretty good for core alone?

I'll dial in memory more this weekend.


----------



## owntecx

Well, I tried the new vBIOS on the TUF OC: same 340W soft limit, with bigger spikes.


----------



## martinhal

Is there much difference between the Palit Pro OC and the MSI Ventus OC? Which one should I choose?


----------



## Zeakie

The Ventus apparently ran the Strix BIOS fine according to a user here, which would be a huge advantage over the Palit if what he claims is true.


----------



## gemini002

owntecx said:


> Well, I tried the new vBIOS on the TUF OC: same 340W soft limit, with bigger spikes.


You have to check voltage in afterburner


----------



## owntecx

gemini002 said:


> You have to check voltage in afterburner


The voltage slider? HWiNFO, GPU-Z and Afterburner all show around 350W max power under load in the Heaven benchmark.


----------



## Professor McNasty

Has anyone been able to flash the XOC bios for the FTW3 Ultra on a normal FTW3? Does it do the same thing?


----------



## Zeakie

www.3dmark.com/3dm/51761735?
Best I got for tonight... almost there. Any other Zotac users hitting near this GPU score? Quite cold nights in Ireland at the moment, the room's freezing, for anyone asking why the temps are low. Under sustained full load and a big overclock it stays around 50°C, and 60°C in the daytime, with fans at 100%.


----------



## Talon2016

Professor McNasty said:


> Has anyone been able to flash the XOC bios for the FTW3 Ultra on a normal FTW3? Does it do the same thing?


They're the same GPU, just different vBIOS. Yes you can flash the FTW3 XOC vBIOS to a normal FTW3.


----------



## Professor McNasty

Talon2016 said:


> They're the same GPU, just different vBIOS. Yes you can flash the FTW3 XOC vBIOS to a normal FTW3.


Thanks. I was 99% certain this was the case but I just wanted to make sure.


----------



## rambosbff

I've been having fun maxing out my Gigabyte 3080 Gaming OC in Port Royal. I can get it to 12170, but I'm running into power limits. Has anyone tried another BIOS for this card yet? Just curious whether a power limit a little beyond 100% would help. It's a 370W card.


----------



## gerardfraser

ssgwright said:


> Why are you in this thread dumping on everyone... you do know what this forum is for... I mean, the literal name of the site is Overclock... In my opinion, you either 1) can't get your hands on a 3080, or 2) you did and it's a dud.


You're right, I did not get one. You hurt my feelings, I am so sad. LOL, Nizzen liked your post. Oh wait, I got three RTX 3080s and my feelings are not hurt.
Well, you can't be all that bad, calling out the other guy, cstkl1 or whatever his name is. Good job, and I will add rep for you.


----------



## dev1ance

Zeakie said:


> www.3dmark.com/3dm/51761735?
> Best I got for tonight... almost there. Any other Zotac users hitting near this GPU score? Quite cold nights in Ireland at the moment, the room's freezing, for anyone asking why the temps are low. Under sustained full load and a big overclock it stays around 50°C, and 60°C in the daytime, with fans at 100%.


I have a Galax with a similar PCB; only the Palit OC BIOS worked well, while everything else wasn't great.
I think cracking 19k would require a 370W BIOS of some sort, because I'm severely power limited.

18,840


https://www.3dmark.com/spy/14585682


18,839


https://www.3dmark.com/spy/14583377



12,260 in Port Royal


----------



## kobs

I'm eyeing an EVGA GeForce RTX 3080 FTW3 Ultra as soon as I can get my hands on one... good or bad choice?


----------



## bungusbeefcake

Hi, I'm looking at getting the Aorus Xtreme and I'm wondering about the published dimensions. It's listed at 319mm, but does anyone know whether that includes the PCIe bracket at the front? I ask because space in my case is tight: if it includes the bracket, that gives me an extra 11mm and it should fit without having to move or modify my pump/res; if it doesn't, I'll have to do some MacGyvering... Can anyone help? What are the dimensions of your cards, and do they include the extra length of the bracket?
Thanks
Thanks


----------



## doom26464

Just hit 18.5k in Time Spy with +100 core and +500 memory.

Memory is tricky. How are people overclocking it? At what point does it start to hurt performance?

I see that when it's pushed too far it can really hurt 1% lows in games, which is the most important stat IMO.


----------



## cstkl1

Chrisch said:


> New BIOS for the 3080 TUF (OC) and maybe the STRIX (8 ROM files in this flash tool):
> 
> https://dlcdnets.asus.com/pub/ASUS/Graphic%20Card/NVIDIA/BIOSUPDATE_TOOL/3080biosupdate.zip
> 
> Direct download for the TUF OC (performance) BIOS:
> 
> TUF_OC_Update.rom at Filehorst - filehorst.de


This should be on the first page.

Uncompress that update tool as well and you will find a proper, full nvflash 5.667.0.
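Since nvflash keeps coming up, here is roughly the usual back-up-then-flash sequence as a sketch only: run it from an elevated prompt, the ROM file name is a placeholder, and `-6` is the subsystem-ID-mismatch override mentioned elsewhere in this thread. Flashing the wrong image can brick a card, so always keep the backup.

```shell
:: Back up the current vBIOS before touching anything (run as Administrator).
nvflash64 --save backup.rom

:: Some cards need the EEPROM write protection removed first.
nvflash64 --protectoff

:: Flash the new image; -6 overrides the PCI subsystem ID mismatch check.
nvflash64 -6 TUF_OC_Update.rom

:: Reboot, then confirm the flashed version with GPU-Z or nvflash itself:
nvflash64 --version
```

On a multi-GPU system, add `--index=N` to target the right card; check `nvflash64 --list` first.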


----------



## shredy44

doom26464 said:


> Just hit 18.5k in time spy with +100 core and +500 memory.
> 
> Memory is tricky how are people overclocking? At what point is it starting to hurt performance?
> 
> I see when its pushed too far can really hurt 1% lows in games which is the most important stat IMO.


Keep increasing memory by 50-100MHz until your performance drops, then back it off.
I find +500 to +700 is the sweet spot.
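That search loop can be sketched in code. This is just an illustration of the procedure, not a real tuner: `set_mem_offset` and `run_benchmark` are stand-ins for whatever tool you actually use (Afterburner plus a 3DMark run, in practice).

```python
def find_mem_sweet_spot(set_mem_offset, run_benchmark, step=100, max_offset=1500):
    """Raise the memory offset until the benchmark score drops, then back off.

    Past the sweet spot, GDDR6X error correction starts retrying transfers,
    so the score falls even though the card doesn't crash.
    """
    best_offset, best_score = 0, float("-inf")
    offset = 0
    while offset <= max_offset:
        set_mem_offset(offset)
        score = run_benchmark()
        if score <= best_score:
            break  # performance dropped: the previous offset was the peak
        best_offset, best_score = offset, score
        offset += step
    set_mem_offset(best_offset)  # settle on the best offset found
    return best_offset, best_score
```

In practice, run each step two or three times and average; a single benchmark pass is noisy enough to fool the comparison.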


----------



## shredy44

kobs said:


> I'm eyeing a EVGA Geforce RTX 3080 FTW3 Ultra as soon as I can get my hands on one.... good or bad choice?


Yeah, I reckon... slap on the EVGA 450W BIOS when you get it and happy days!
The challenge is keeping them cool on air! lol


----------



## doom26464



shredy44 said:


> Keep increasing memory by 50-100MHz until your performance drops, then back it off.
> I find +500 to +700 is the sweet spot.


I pushed to +900 on memory and got to 18.6k in Time Spy. Tweaking memory is quite tricky.


----------



## gemini002

owntecx said:


> Nice. Got to test myself later today.
> Its +- 375 sustained or spike?


On second look, it was a spike; the average was 355-365W.


----------



## gemini002

Chrisch said:


> I need a waterblock for my TUF.
> 
> https://www.3dmark.com/pr/401035
> 
> https://www.3dmark.com/spy/14571857


what settings for OC?


----------



## rambosbff

rambosbff said:


> I've been having fun maxing out my 3080 gigabyte gaming oc in port royal. I can get it to 12170, but running into pwr limits. Anyone venture out on testing another bios for this card yet? Just curious to see if a little power limit beyond 100 would help out. It's a 370w card.


Yeah idk man gfy.


----------



## VPII

Well, for a card that is basically limited to 320W even with the +9% power limit you get on this Palit RTX 3080 GamingPro OC, this is not too bad, seeing that I got a 19103 graphics score. I had to increase the core clock by +165MHz for a 2220MHz effective core clock, plus +750 on memory.



https://www.3dmark.com/spy/14439322


----------



## VPII

I found something interesting. I had only used a +750 memory overclock for all my runs, so I decided to try +1000, and interestingly, during that run the FPS was way lower than usual, but the core clock stayed above 2000MHz for the entire run, with power draw below 300W (just over 300W in certain sections). Now look at the difference.

Here with +1000 memory (look at the average clock speed):

https://www.3dmark.com/3dm/51773755?

Here with +800 memory (again, look at the average clock speed):

https://www.3dmark.com/3dm/51773898?

These are my daily clocks, using a V/F curve I modified.


----------



## gemini002

Flashed the Strix OC BIOS onto my TUF board. It reports 450W, but performance is worse, as clocks are lower no matter the settings. Back to the TUF OC BIOS. I have a Trio coming on Monday; have people put the Strix OC BIOS on it and gotten better performance?


----------



## Falkentyne

gemini002 said:


> Flashed the Strix OC BIOS onto my TUF board. It reports 450W, but performance is worse, as clocks are lower no matter the settings. Back to the TUF OC BIOS. I have a Trio coming on Monday; have people put the Strix OC BIOS on it and gotten better performance?


Please read the thread.
You can NOT flash a 3x8-pin BIOS onto a 2x8-pin board. It won't work properly.


----------



## gemini002

Falkentyne said:


> Please read the thread.
> You can NOT flash a 3 pin bios onto a 2 pin board. It won't work properly.


The thread is 75 pages, hence why I asked. So the ASUS Strix BIOS works well on the Trio?


----------



## ssgwright

Best I can do on air, I think... 12,581 in Port Royal: http://www.3dmark.com/pr/407929


----------



## DStealth

It works, but with a 3x8-pin card's BIOS the 2x8-pin cards get even more power limited: the BIOS reads the PCIe 1 + PCIe 2 + PCIe 3 lines, but the third doesn't exist and just adds artificial power to the total board TDP. Long story short, a 320W BIOS for 2x8-pin will give 1900-1950MHz runs in 3DMark, where even a 450W 3x8-pin BIOS will struggle to get over 1900.
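That point reduces to simple arithmetic: the limiter acts on *reported* board power, so any phantom reading on the missing third rail is subtracted straight from the real headroom. A toy model of this (the 110W phantom figure below is an assumed example, not a measured value):

```python
def real_watts_at_throttle(bios_limit_w, phantom_rail_w):
    """Real board power when the limiter trips.

    The BIOS sums all three 8-pin inputs; on a 2x8-pin board the third
    input contributes a bogus 'phantom' reading, so the card throttles
    while the real draw is still phantom_rail_w below the limit.
    """
    return bios_limit_w - phantom_rail_w

# A 450W 3x8-pin BIOS with an assumed 110W phantom reading throttles at
# 340W of real draw: worse than just running a proper 2x8-pin 320W BIOS.
print(real_watts_at_throttle(450, 110))
```

This also explains the "440W shown, worse FPS than stock" reports above: the software reads the inflated total while the GPU actually gets less power than before.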


----------



## dr.Rafi

doom26464 said:


> I just snagged an MSI Gaming X Trio. Not the card I had my heart set on, but with supply where it's at, I'd take whatever I could get my hands on.
> 
> This card seems very limited by its power limit. Triple 8-pin, and the cooler really does an impressive job. Sucks that MSI power-limited this card so badly. I get about 340-350W max on it with clocks in the mid-1900s.
> 
> Has anyone had success flashing the card with a different BIOS? Is MSI going to make a better vBIOS for this thing?
> 
> Will flashing the vBIOS void my warranty, too?


It will, but you can flash the original BIOS back before sending it in, unless the card completely dies on a non-factory BIOS; then your only option is to buy a ROM flashing tool and flash the original BIOS back that way.


----------



## dr.Rafi

doom26464 said:


> So for 100W more you get 2-3% more performance in synthetics. I imagine in gaming it's like a 1% FPS uplift for the extra juice.
> 
> Doesn't seem worth it to me. I use my PC for streaming too, so noise and heat are factors.
> 
> I'll probably see what little core clock and memory I can squeeze out with the stock BIOS and call it a day. Tuning the fan curve seems like the best investment of my time at this point.


Mine is water-cooled and never passes 45°C over hours of gaming and benching.


----------



## dr.Rafi

Purple_Light said:


> Does anyone understand what is preventing these cards from scaling up with power?


NVIDIA did their homework well on the power limits for this family so they can still sell the 3090; otherwise the 3080 could easily reach 3090 performance.
Any higher-power BIOS flashed to a given 3080 is either misreading the power consumption, or the extra juice is being fed not to the GPU but to other components or features on the board. I have an HX1600i power supply and I checked which BIOS really draws more power.


----------



## shredy44

Frame Chasers found a way to flash the 450W BIOS onto the XC3 card!


----------



## VPII

So I decided to limit my clocks. Stock, this Palit RTX 3080 GamingPro OC would go up to 2040-2055MHz, but since I am basically 320W power limited even with the +9% increase, I wanted to see what I could do, and at what core voltage, to keep the clock speed as high as possible. I was pretty impressed with 2015MHz from 0.918V onwards and 2010MHz at 0.906-0.912V. The card is capped at 2025MHz max core, as you'll see from the link, but I love the fact that my average clock speed is only 52MHz below the max. Using the V/F curve to get the best performance seems to be the way to go.



https://www.3dmark.com/3dm/51779758?
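The undervolting win described here follows from first-order CMOS scaling: dynamic power goes roughly as P ∝ f·V². A quick sanity check (the 1.062V stock reference is an assumption for illustration; actual Ampere stock boost voltage varies by card):

```python
def relative_power(f_mhz, v, f_ref=2040.0, v_ref=1.062):
    """Approximate dynamic power relative to a reference point, P ~ f * V^2."""
    return (f_mhz / f_ref) * (v / v_ref) ** 2

# Giving up ~1.5% clock (2040 -> 2010 MHz) at 0.906V instead of 1.062V
# cuts modeled power to about 72% of stock, which is why the card stops
# bouncing off its 320W limit and holds the clock instead.
print(round(relative_power(2010, 0.906), 3))
```

The model ignores static leakage, so treat it as a rough lower bound; the direction of the effect is what matters.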


----------



## martinhal

VPII said:


> So I decided to limit my clocks. Stock, this Palit RTX 3080 GamingPro OC would go up to 2040-2055MHz, but since I am basically 320W power limited even with the +9% increase, I wanted to see what I could do, and at what core voltage, to keep the clock speed as high as possible. I was pretty impressed with 2015MHz from 0.918V onwards and 2010MHz at 0.906-0.912V. The card is capped at 2025MHz max core, as you'll see from the link, but I love the fact that my average clock speed is only 52MHz below the max. Using the V/F curve to get the best performance seems to be the way to go.
> 
> https://www.3dmark.com/3dm/51779758?


Get into the top 99% and call it a day.


----------



## VPII

martinhal said:


> Get in to top 99 % and call it a day


Yeah, well, we'll have to see. My best result, posted earlier, was 18628, and the result itself was not too bad except that you are held back by the power limit. But this seems like the way to go with these 320W power-limited cards.


----------



## ssgwright

shredy44 said:


> Frame chasers found a way to flash the 450watt bios on the XC3 card!


lol, this doesn't work; don't waste your time... (well, on the TUF anyway)


----------



## Nizzen

VPII said:


> Well, for a card that is basically limited to 320W even with the +9% power limit you get on this Palit RTX 3080 GamingPro OC, this is not too bad, seeing that I got a 19103 graphics score. I had to increase the core clock by +165MHz for a 2220MHz effective core clock, plus +750 on memory.
> 
> https://www.3dmark.com/spy/14439322


Your PEAK is 2220MHz... average clock frequency 1989MHz... FAR away from 2200MHz.

This is my Palit 3080 Gaming OC with the stock BIOS and air cooling.

http://www.3dmark.com/spy/14029645

The power limit is REAL.


----------



## VPII

Nizzen said:


> Your PEAK is 2220MHz... average clock frequency 1989MHz... FAR away from 2200MHz.
> 
> This is my Palit 3080 Gaming OC with the stock BIOS and air cooling.
> 
> http://www.3dmark.com/spy/14029645
> 
> The power limit is REAL.


Well, that is great, as you have one of the better models. Can you do me a favour and save your BIOS to share with me? I have been in contact with Palit about this and they do not understand what I mean by the limit staying at 320W even with the increased 9%. PLEASE, as I want to try your BIOS to see if it makes a difference. I can see your card is doing really well, so I want to see whether your BIOS helps. If you cannot share it online, I'll give you my email in a PM.


----------



## martin28bln

ssgwright said:


> lol this doesn't work don't waste your time... (well on the TUF anyway)


Do you actually know that it doesn't work, or did you test it? I didn't understand him, because he shunt-modded one card and said he additionally needed the BIOS...


----------



## man from atlantis

This guy claims he increased the power limit of the XC3 using the latest FTW3 BIOS (450W). The old 400W BIOS doesn't do a thing for the XC3.


----------



## ssgwright

martin28bln said:


> Do you actually know that it doesn't work, or did you test it? I didn't understand him, because he shunt-modded one card and said he additionally needed the BIOS...


I tested it on my TUF: same result as flashing any other 3x8-pin BIOS on a 2x8-pin card... less power.


----------



## ncck

Does anyone have a before/after of the new ASUS BIOS on an ASUS TUF OC? I'm a little scared to update my GPU BIOS when things have been running smoothly for me.


----------



## lowrider_05

ncck said:


> Does anyone have a before/after of the ASUS BIOS using an asus tuf OC? I'm a little scared to update my gpu bios when things have been running smoothly for me


I have flashed the Strix OC BIOS on my TUF OC and it overclocks just the same, if not a little worse, so don't bother trying.


----------



## Zeakie

lowrider_05 said:


> I have flashed the strix oc bios on my tuf oc and it overclocks just the same if not a little worse, so don't bother trying


I think he means flashing the updated TUF OC BIOS onto his current TUF OC.


----------



## spajdr

man from atlantis said:


> this guy claims he increased the power limit of XC3 with using the latest FTW3 bios(450W). old 400W bios doesn't do a thing for XC3.


Judging from the comments below, it does nothing for Gigabyte cards either? Has anyone tried it on an Eagle OC card yet?


----------



## Nizzen

VPII said:


> Well, that is great, as you have one of the better models. Can you do me a favour and save your BIOS to share with me? I have been in contact with Palit about this and they do not understand what I mean by the limit staying at 320W even with the increased 9%. PLEASE, as I want to try your BIOS to see if it makes a difference. I can see your card is doing really well, so I want to see whether your BIOS helps. If you cannot share it online, I'll give you my email in a PM.


Newest palit 3080 oc bios:


https://www.diskusjon.no/applications/core/interface/file/attachment.php?id=635068


----------



## VPII

Nizzen said:


> Newest palit 3080 oc bios:
> 
> 
> https://www.diskusjon.no/applications/core/interface/file/attachment.php?id=635068


Thank you, my brother... much appreciated. I just want to test whether it makes any difference.


----------



## shredy44

ssgwright said:


> lol this doesn't work don't waste your time... (well on the TUF anyway)


The TUF is a custom PCB; get a 3x8-pin reference-board card.


----------



## mainyu



gemini002 said:


> the thread is 75 pages hence why I asked. So the Asus Strix Bios works well on Trio?


Yes, the Strix BIOS is fully functional on the Trio.
The cooling on the Trio is great, so thermals are fine. (At least the VRM will be cooler than on the Zotac 1080 Ti AMP Extreme, lol.)


----------



## doom26464

Keeping these things cool seems to matter more for holding stable clocks than anything else. Chucking a ton more juice at them seems to give poor, diminishing returns.

Also, has anyone tried encoding with their 3080 yet? I've been having issues trying to live-record Warzone gameplay without massive FPS drops on my end or stream/recording stuttering. I expected NVENC on these things, plus the raw FPS headroom, to be a beast at this, but so far it's been disappointing.


----------



## VPII

VPII said:


> So I decided to limit my clocks. Stock, this Palit RTX 3080 GamingPro OC would go up to 2040-2055MHz, but since I am basically 320W power limited even with the +9% increase, I wanted to see what I could do, and at what core voltage, to keep the clock speed as high as possible. I was pretty impressed with 2015MHz from 0.918V onwards and 2010MHz at 0.906-0.912V. The card is capped at 2025MHz max core, as you'll see from the link, but I love the fact that my average clock speed is only 52MHz below the max. Using the V/F curve to get the best performance seems to be the way to go.
> 
> https://www.3dmark.com/3dm/51779758?


So I'll say this again: if you are able to modify your V/F curve to remain stable at clocks above 2000MHz, you can gain a lot. I did the run in the link using a modified V/F curve and it worked great, with my card's maximum clocks around 30MHz lower than stock but a much better result. I did find, when I first ran Control at 1080p to check clock stability, that it was actually not that stable, so I adjusted the curve a little and now it appears to be stable. I then tried Control at 1440p with all detail maxed out and was shocked to see power draw below 300W most of the time, so the clocks stayed between 2010 and 2025MHz. It really works great.


----------



## cstkl1

shredy44 said:


> Frame chasers found a way to flash the 450watt bios on the XC3 card!


BTW, you know this dude is full of lies, right?

The card draws what it needs.
[email protected] only 400W on water at 45-50°C.

An unlocked BIOS or a shunt mod doesn't mean the card is actually pulling that amount.


----------



## cstkl1

man from atlantis said:


> this guy claims he increased the power limit of XC3 with using the latest FTW3 bios(450W). old 400W bios doesn't do a thing for XC3.


It's a lie, bro. Check owikh84: 2145MHz on a Strix under water at 1.07-1.08V,
only 400W, and the GPU clock doesn't drop.


----------



## cstkl1

The GPU pulls the power it needs, provided thermals and voltage allow it.

Same as with a CPU: we have unlocked motherboards.
Does our CPU pull 4096 watts? lol

That YouTuber dude is just a liar on an epic level.


----------



## asdkj1740

cstkl1 said:


> The GPU pulls the power it needs, provided thermals and voltage allow it.
> 
> Same as with a CPU: we have unlocked motherboards.
> Does our CPU pull 4096 watts? lol
> 
> That YouTuber dude is just a liar on an epic level.


At least I admire him for saying the EVGA XC3 cooler is ****, lol.
What he has done shows that shunt-modding the RTX 3000 series is problematic.


----------



## cstkl1

asdkj1740 said:


> at least i admire him saying evga xc3 cooler is **** lol.
> what he has done shows shunt mod on rtx 3000 is problematic.


Well, I don't like people who try to mislead others on YouTube to gain popularity, just to be placed on a pedestal.

I've noticed unsung heroes who actually help all of us, but nobody knows what they did. All walk, no talk.

There are many of them.

But yeah, a 450W BIOS with water should do it, so I don't see a reason to shunt mod. Get that EVGA card and water-cool it.


----------



## cstkl1

VPII said:


> So I'll say this again: if you are able to modify your V/F curve to remain stable at clocks above 2000MHz, you can gain a lot. I did the run in the link using a modified V/F curve and it worked great, with my card's maximum clocks around 30MHz lower than stock but a much better result. I did find, when I first ran Control at 1080p to check clock stability, that it was actually not that stable, so I adjusted the curve a little and now it appears to be stable. I then tried Control at 1440p with all detail maxed out and was shocked to see power draw below 300W most of the time, so the clocks stayed between 2010 and 2025MHz. It really works great.


Yeah, I know a German dude who's running a TUF at [email protected], I think... temps in the 40s, wattage around 2xx.


----------



## martinhal

cstkl1 said:


> its lie bro. check owikh84 2145 on strix in water at 1.07-1.08v
> only 400 watss gpu clock doesnt drop.


Seems legit


----------



## cstkl1

martinhal said:


> Seems legit











NVIDIA Graphics Card Overclocking V1

forum.lowyat.net

Just skip to the last page; all his results are there. This is on an open test bench in MY, so you guys should fare even better.


----------



## cstkl1

600W at 2200MHz with no voltage mod on a 3080... LOL... that's just BS of the highest level.


----------



## martinhal

I think the only thing that dude did was trick HWiNFO into thinking he has three PCIe cables, hence an overstated power reading. 40 degrees with 600W on water... not happening.


----------



## cstkl1

martinhal said:


> I think the only thing the dude did was trick HWiNFO into thinking he has three PCIe cables, hence an overstated power reading. 40 degrees with 600W on water... not happening.


Shunt mod: all readings are false anyway.

BTW, the Strix 3080 is interesting in idle and PCIe slot power...

The TUF pulls a constant 50W from the slot with 6xW peaks, so it's hitting the max the slot can do (the 5V rail, I think; there's a 3.3V rail reserved for something, hence the 75W max, which is why you cannot get 375W on the TUF...).

The Strix pulls only 20xW under load, and 10W at idle.

Strix 3080 total idle is 20W.
TUF 3080 total idle is 50-60W.

----------



## cstkl1

Edit: the TUF 3080's idle draw is affected by Afterburner; without it, idle goes down to 3xW.
Odd, again, that the Strix doesn't suffer from this.


----------



## Zemo

mainyu said:


> 
> Yes, the strix bios are fully functional in trio.
> The cooling on the trio is great, so the thermal issues are fine. (At least the vrm will be cooler than the zotac 1080ti amp extreme lol)


What about the EVGA 450w beta bios? Would that be compatible with the Trio, as it has 3x 8-pin?


----------



## mainyu

Zemo said:


> What about EVGA 450w beta bios, would that be compatible with Trio as it has 3x 8pin?


The ftw3 430w bios was working with the trio, so I think it would work.


----------



## Vapochilled

The guy said he measured the watts at the wall... and it was pulling more... so... that's also weird.


----------



## cstkl1

https://www.3dmark.com/spy/14623118



if only I had a better card and not this ****ty strix.. if that tuf of mine had a mod bios.... it would have flown, at least touched closer to 20k..


----------



## Zeakie

cstkl1 said:


> https://www.3dmark.com/spy/14623118
> 
> 
> 
> if only had a better card and not this ****ty strix.. if that tuf of mine had a mod bios....would have flown atleast touched closer to 20k..


Relax champ, you're beating a lot of 3090s with that score.


----------



## cstkl1

Zeakie said:


> Relax champ you're beating alot of 3090s with that score


Not really the gpu.
It's my cpu and ram that's pushing the gpu score.


----------



## dr.Rafi

Purple_Light said:


> Have you guys tried the nvflash 5.665.0 someone have extracted from the palit bios update earlier ?(around page 35 somewhere)


For me it's great, no protection; just -6 every time and good to go.


----------



## spajdr

cstkl1 said:


> btw u know this dude full of lies right
> 
> the card draws what it needs
> [email protected] only 400watts water 45-50c
> 
> unlock bios or shunt mod doesnt mean the card pulling that amount.


He actually replied that we are just theorycrafters without even having the actual card


----------



## Alemancio

Serious question, how are people getting their 3080s? Does refreshing Amazon every minute even do anything?


----------



## Riadon

Even though a normal 3-pin bios doesn't work with 2-pin cards, I would think a 3-pin XOC bios with a crazy power limit like 900w would still be far better than any 2-pin bios: even if the bios is telling the card to pull 300w per connector, that's still 600w across two, which is far more than any 2-pin bios will ever allow. Flashing something like a 450w 3-pin bios on a 2-pin card, though, is a downgrade, because you'd be getting 150w per connector, which would only equal 300w.
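As a rough sketch of the per-connector arithmetic in that post (the even split across connectors is the post's simplifying assumption, not how every BIOS actually balances its rails):

```python
def per_pin_budget(total_limit_w: float, pins_in_bios: int) -> float:
    """Power budget the BIOS allocates per 8-pin connector,
    assuming (as the post does) an even split across connectors."""
    return total_limit_w / pins_in_bios

def effective_limit(total_limit_w: float, pins_in_bios: int, pins_on_card: int) -> float:
    """Total power a card with fewer connectors could draw under a
    BIOS written for more connectors, under the same assumption."""
    return per_pin_budget(total_limit_w, pins_in_bios) * pins_on_card

# 900w 3-pin XOC bios on a 2-pin card: 300w/pin x 2 pins = 600w
print(effective_limit(900, 3, 2))  # 600.0
# 450w 3-pin bios on a 2-pin card: 150w/pin x 2 pins = 300w (a downgrade)
print(effective_limit(450, 3, 2))  # 300.0
```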


----------



## gemini002

Alemancio said:


> Serious question, how are people getting their 3080s? Does refreshing Amazon every minute even do anything?


no, newegg on Thursday and Friday


----------



## sblantipodi

will we ever see the FE back in stock?


----------



## ncck

Alemancio said:


> Serious question, how are people getting their 3080s? Does refreshing Amazon every minute even do anything?


Micro Center in the United States for me


----------



## Alemancio

gemini002 said:


> no new egg Thursday and Friday


Why those days in particular?



ncck said:


> Micro center in the united states for me


In person, right?


----------



## Shadownet_

As someone who has both the 3080 Trio and the TUF, which one should I keep for the best performance? I would give the other one to my buddy who doesn't care.


----------



## ssgwright

I'm biased because I own a TUF, but I would say the build quality of the TUF is better than the Trio's.


----------



## dev1ance

Shadownet_ said:


> As someone who has both the 3080 Trio and the TUF, which one should I keep for the best performance? I would give the other one to my buddy who doesn't care.


Test both and see which has the better bin. Trio has the benefit of 3x8Pin for high PL BIOs


----------



## ssgwright

isn't the Trio just 2x 8-pin? If it's a three then it's a no-brainer.


----------



## Shadownet_

dev1ance said:


> Test both and see which has the better bin. Trio has the benefit of 3x8Pin for high PL BIOs


I'm new to this. Is there a guide on testing and overclocking?


----------



## gemini002

Alemancio said:


> Why those days in particular?
> 
> 
> In person, right?


just when they've been dropping


----------



## gemini002

Shadownet_ said:


> As someone who has both the 3080 Trio and the TUF, which one should I keep for the best performance? I would give the other one to my buddy who doesn't care.


In the same boat. I had a 3080 Ventus and it's severely power locked. The Tuf I do like, as it's quiet and I get about 1950-2025 with a 375w peak, but the average is 355-365w. I have a Trio coming and will keep that, flash the Strix bios, and call it a day. I wanted the FTW or Strix, but the Trio handles the Strix bios well; people are saying 2100-2200 boost.


----------



## cstkl1

Shadownet_ said:


> As someone who has both the 3080 Trio and the TUF, which one should I keep for the best performance? I would give the other one to my buddy who doesn't care.


keep both
the tuf, if somebody mods the bios for 450w,

would be the card of the year to WC.

msi's advantage is flashing the strix bios for higher power, but it's using cheaper components.


----------



## ssgwright

cstkl1 said:


> keep both
> tuf if somebody mods the bios for 450
> 
> it be the card of the year to WC.
> 
> msi advantage is flashing strix bios for higher power but its using cheaper components.


agreed, if the TUF gets a power boost it's just a better-built card


----------



## Shadownet_

cstkl1 said:


> keep both
> tuf if somebody mods the bios for 450
> 
> it be the card of the year to WC.
> 
> msi advantage is flashing strix bios for higher power but its using cheaper components.


What are the chances it will get a power boost to 450? Could the cooling of the TUF even handle that much power?

I'm leaning toward keeping the TUF just cause I already have an Asus mobo, it would be easier to control the RGB from there instead of having to deal with MSI Dragon Center/Mystic Light.


----------



## cstkl1

Shadownet_ said:


> What are the chances it will get a power boost to 450? Could the cooling of the TUF even handle that much power?
> 
> I'm leaning toward keeping the TUF just cause I already have an Asus mobo, it would be easier to control the RGB from there instead of having to deal with MSI Dragon Center/Mystic Light.


the 2080ti strix was about a similar build to the tuf,
had an unlimited bios,
2100 on it pulling 320 watts.

on a 3080 u just need around 400-420w,
so it should be no issue, as it's only an extra 25-30w on each of the 8-pins. most psu cables can handle that easily. it's only when you are around 300 watts per 8-pin that the cables have to be good, like 12awg etc..

the gpu draws what it needs. so heck, just make it open. all the cards are volt limited in the bios anyway, so it's pretty safe.

a tuf with a 450-500w bios makes the strix look like a garbage overpriced card.
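For what it's worth, the "watts per 8-pin" concern above comes down to amps per conductor; a quick back-of-the-envelope sketch (assuming a 12V rail and three 12V wires per 8-pin, which is the usual PCIe cable layout):

```python
def amps_per_conductor(watts_per_8pin: float, rail_v: float = 12.0, hot_wires: int = 3) -> float:
    """Current carried by each 12V conductor of an 8-pin PCIe cable.
    An 8-pin connector has three 12V wires (plus grounds/sense)."""
    return watts_per_8pin / rail_v / hot_wires

# Spec-ish 150w per 8-pin: ~4.2A per wire -> comfortable for common gauges
print(round(amps_per_conductor(150), 1))  # 4.2
# Pushing ~300w per 8-pin: ~8.3A per wire -> why heavier gauge gets suggested
print(round(amps_per_conductor(300), 1))  # 8.3
```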


----------



## DStealth

@*VPII*
Give the new Tuf OC BIOS a try.. the extended power limit to 375w is not working, but at least the base is 340w. On my Palit this gives me the best Superposition score.


----------



## VPII

DStealth said:


> @*VPII*
> Give the new TufOC BIOS a try ..extended power limit to 375w is not working but at least the base is 340w. On my Palit this gives me the best Superposition score.


Hi there, thank you. I tried so many bioses and went back to the Palit one. I'll give you a glimpse of what my card does now with the vcurve set to not go over 2025mhz core. Obviously I had to fine tune to get it to work in games as well, for which I tried Control at 1080P and 1440P, and was surprised that in Control, even at 1440P, it never drops below 2010mhz and mostly sits at 2025mhz. But here you go.

The first run is with my modified vcurve, the second is the highest I managed.


----------



## biopster88

cstkl1 said:


> Not da really gpu.
> Its my cpu and rams that pushing the gpu score


What ram model (part nr) are you running? I see g skill in your 3dmark result


----------



## DStealth

But how is it possible to hit the 350w limit @ 0.9v only?!? What is going on...


----------



## MrBridgeSix

DStealth said:


> But how is possible to hit 350w limit @ 0.9v only ?!? What is going on...


Time Spy Extreme can go as low as 0.837v and still use 350W.
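One hedged way to read this: dynamic power scales roughly as C_eff·V²·f, and the effective switching term C_eff depends on how hard the workload toggles the chip. A quick estimate of how much more switching activity a heavy test must drive to hit the same 350w at 0.837v as another workload does at, say, 1.05v (the 1.05v comparison point is illustrative, not measured):

```python
def activity_ratio(v_low: float, v_high: float) -> float:
    """With P ~ C_eff * V^2 * f, holding power constant while dropping
    voltage implies the effective switching term grew by this factor."""
    return (v_high / v_low) ** 2

# Hitting the same 350w limit at 0.837v instead of 1.05v means the
# workload is driving roughly 1.57x the switching activity / current:
print(round(activity_ratio(0.837, 1.05), 2))  # 1.57
```

Which is why a brutal load like Time Spy Extreme can slam into the power limit even at voltages where lighter games never would.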


----------



## DStealth

Right, the updated Palit OC (350w) BIOS is best for the moment; using your settings I got the highest result as well.


----------



## cstkl1

biopster88 said:


> What ram model (part nr) are you running? I see g skill in your 3dmark result


It's irrelevant


----------



## Zeakie

DStealth said:


> Right Palit OC updated(350w) BIOS is best for the moment using your setting got the highest result also


Exact same score on my trinity oc on tuf bios


----------



## ncck

Alemancio said:


> Why those days in particular?
> 
> 
> In person, right?


Yes, in person; you go before the store opens and wait in line. They hand out vouchers in the morning and then you can purchase. Sometimes they don't have any cards, other times they have several, so best to be there early. I recommend Thursdays and Fridays. You may have to go twice.


----------



## acoustic

Got mine at Microcenter on a Friday too.

Lucky to have a store that happens to get a lot of inventory..


----------



## JonnyV75

How’s this for Port Royal and Superposition scores on a FTW3 Ultra? Not a dud?

For Port Royal this was with +150 core, +1000 memory and both power and voltage sliders at max. And using the new 450w bios.



http://www.3dmark.com/pr/409128


----------



## acoustic

Not bad. Memory going up to +1000 is good; my FTW3 has terrible memory chips, I max at +500. I'd double check that +1000 isn't dropping performance due to error-checking. I pulled a 12648 (in the 12600 area) on the STRIX OC bios, but that was only at my 24/7 clocks (+45core/+400mem) and higher ambient temp. If you can get the temps down, then you'd probably see another 100 or more points. You might also find that +150 isn't actually stable as the thermal throttling is causing core clocks to drop, and if it wasn't thermal throttling it might end up unstable.

You could also try dropping the voltage slider to 0 and running it again. Sometimes that slider causes more throttling and causes more issues than anything else.

I need to turn the AC on and do some benching, but I don't care all that much for right now. Once I put the card under water, I'll go all-out for a day and see what the card can manage.


----------



## lowrider_05

JonnyV75 said:


> How’s this for Port Royal and Superposition scores on a FTW3 Ultra? Not a dud?
> 
> For Port Royal this was with +150 core, +1000 memory and both power and voltage sliders at max. And using the new 450w bios.
> 
> 
> 
> http://www.3dmark.com/pr/409128


It all comes down to luck in the silicon lottery and temperature my TUF OC can score the same but could do much more with a higher Powerlimit

https://www.3dmark.com/pr/383292


----------



## JonnyV75

acoustic said:


> Not bad. Memory going up to +1000 is good; my FTW3 has terrible memory chips, I max at +500. I'd double check that +1000 isn't dropping performance due to error-checking. I pulled a 12648 (in the 12600 area) on the STRIX OC bios, but that was only at my 24/7 clocks (+45core/+400mem) and higher ambient temp. If you can get the temps down, then you'd probably see another 100 or more points. You might also find that +150 isn't actually stable as the thermal throttling is causing core clocks to drop, and if it wasn't thermal throttling it might end up unstable.
> 
> You could also try dropping the voltage slider to 0 and running it again. Sometimes that slider causes more throttling and causes more issues than anything else.
> 
> I need to turn the AC on and do some benching, but I don't care all that much for right now. Once I put the card under water, I'll go all-out for a day and see what the card can manage.


Thx. I’ll try with voltage at 0. 

The +150 isn't stable outside of Port Royal - it crashes in TimeSpy. I get normal-use stability at +110 with games/VR/MadVR, and the SuperPosition score is actually with +110 on the core.

Regarding memory, I was increasing in intervals of +100 and getting incremental score improvements up to +1000.

Regarding temps, if I open up the side of my Phanteks Evolv X case I can get temps down to the high 50s. I'll be adding EVGA's Hybrid cooler once EVGA makes one available. Looking forward to that in November (and the 5900X).


----------



## VPII

I find it strange seeing people talk about a +100 or +110 core speed increase when you obviously need to work in 15mhz increments. With +100 you basically sit at a +90mhz core increase, and with +110 at +105mhz. I bring this up because it's one thing that really irritates me when some of these famous youtubers do it and in the same breath talk about 15mhz increments. Come on.
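A tiny sketch of that binning, covering both behaviours reported in this thread (truncate down vs round up; which one the driver actually does is debated a few posts later):

```python
STEP = 15  # Ampere/Turing clocks move in 15 MHz bins

def snap_down(offset_mhz: int) -> int:
    """Offset you actually get if the driver truncates to a bin."""
    return (offset_mhz // STEP) * STEP

def snap_up(offset_mhz: int) -> int:
    """Offset you actually get if the driver rounds up to a bin instead."""
    return -((-offset_mhz) // STEP) * STEP

print(snap_down(100), snap_up(100))  # 90 105
print(snap_down(110), snap_up(110))  # 105 120
```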


----------



## acoustic

+110 is rounded to +120, I believe. I've never actually tested whether it'll round down. The same way Turing functions, the card increases in 15Mhz increments.

I'd be careful with EVGA's hybrid cooler. I know they're finally going beyond the 120mm rad for the separate purchase, but even with a 240mm.. I feel like the rad will still end up heat-soaked and water temp will skyrocket. I had the Hybrid on my 2080 Ti FTW3 Ultra and it was great for benching, but with sustained load I was hitting the low 70s. That's with a 373w BIOS. My 3080 FTW3 Ultra in Metro Exodus pushes 425-450 watts consistently. It never drops under 425w.

I'm not entirely sure a 240mm rad will be able to handle that kind of heat. The 120mm surely couldn't handle 370 watts.


----------



## JonnyV75

VPII said:


> I find it strange seeing people talk about +100 or +110core speed increase when you obviously need to work with 15mhz increments. So for a +100 you basically sit with +90mhz core and +110 with a +105mhz core increase. Look I bring this up as it is one thing that really irritates me when I look at some of these famous youtubers that does it and in the same breath they'll talk about 15mhz increments. Come on.


I did do increments of 15 (90, 105, 120, 135, 150, etc). However, I tried 5mhz increments afterwards to see if I got score improvements - and I did, so I left it and used it. If that bugs you, so be it.


----------



## VPII

JonnyV75 said:


> I did do increments of 15 (90, 105, 120, 135, 150 ... etc). However, I tried 5mhz increments afterwards to see if I got score improvements - and I did so I left it and used it. If that bugs you. So be it.


No worries, then so be it. Sorry, I just wanted to state the obvious, but given that +5mhz gave you additional performance, great.


----------



## VPII

acoustic said:


> +110 is rounded to +120, I believe. I've never actually tested if it'll round down. The same way Turing functions, the card increases in 15Mhz increments.
> 
> I'd be careful with EVGAs hybrid cooler. I know they're finally going beyond the 120mm rad for the separate purchase, but even with a 240mm .. I feel like the rad will still end up heat soaked and water-temp will skyrocket. I had the Hybrid on my 2080TI FTW3 Ultra and it was great for benching, but with sustained load I was hitting low 70s. That's with a 373w BIOS. My 3080 FTW3 Ultra in Metro Exodus is pushing 425-450watt consistently. Never drops under 425w.
> 
> I'm not entirely sure a 240mm rad will be able to handle that kind of heat. The 120mm surely couldn't handle 370watts.


I had an NZXT Kraken G12 with a Corsair H110 AIO cooler, thus 280mm, on my RTX 2080 Ti. That card was flashed with the XOC 1000 watt bios, which worked great, and the gpu would never, and I mean never, be more than 44 to 45c. The cooling worked great, but unfortunately with the RTX 2000 series you lose clocks first at around 38c, then 41c; beyond that I cannot comment, but that is what I saw.


----------



## JonnyV75

Follow-up on the temps. I take about a 1000 point hit in Port Royal when Precision X is open; it has to be closed for a decent score. However, when closed, the FTW3 reverts to its default fan profile, I'm no longer running fans at 100%, and I get a mid-60s average temp.

Anyone else notice this with Precision X? Any workaround to get high fan speeds with it closed?


----------



## acoustic

Get rid of PX1 and use Afterburner. I gave up with that garbage program. Constant stability issues, settings not saving, etc. It's pathetic that the developers behind PX1 haven't been able to make a program even remotely close to competing with Afterburner after all these years.

The OC BIOS switch automatic fan control is close enough to 1:1, and you can manually control 2 of the 3 fans to 100% when benching.


----------



## Shadownet_

Well my buddy is asking for me to give him his GPU today. I haven't done any overclocking/undervolting yet.

Trio vs TUF. Which one should I keep? I'll give him the other one.


----------



## Nizzen

Shadownet_ said:


> Well my buddy is asking for me to give him his GPU today. I haven't done any overclocking/undervolting yet.
> 
> Trio vs TUF. Which one should I keep? I'll give him the other one.


Pretty much the same, both cards. Performance is within 1%, and acoustics too.


----------



## DarknightOCR

Hello. I don't know if this has already been discussed somewhere in all these pages, but is it possible to put the XOC bios on an MSI Ventus? Does the power limit go up, or is it the same at 320W?

Will it help to reach 2000mhz clocks?
It will be used with a waterblock.

Thanks
Thanks


----------



## VPII

DarknightOCR said:


> Hello. I don't know if it has already been spoken in the middle of all these pages. but is it possible to put the XOC bios in a msi ventus? does the powerlimit go up, or is it the same at 320W?
> 
> will it help to have 2000mhz clocks?
> will be for use with waterblock
> 
> Thanks


What is the power limit for your card? I have a Palit, and even with the 9% increased power limit it still drops clocks when reaching 320 watts. So what I have done is modify the vcurve in MSI Afterburner, and now I basically get 2010 to 2025mhz all the time while running Control at 1440P max settings. You'll need to play around to see at which voltage your card is still stable at speeds above 2000mhz; the lower the vcore the better. At present I am sitting at 912 to 918mv for 2010mhz and 925 to 943mv for 2025mhz, but it works for me.
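Afterburner's curve editor is GUI-only, but the "flatten the curve above a target voltage" trick it enables can be sketched in code. The stock curve points below are made up for illustration; the flattening logic is the point:

```python
def flatten_curve(curve, target_mv, offset_mhz):
    """Sketch of an Afterburner curve-editor undervolt: shift the stock
    V/F curve up by an offset, then cap every point at/above the target
    voltage to that point's clock, so the card never requests more
    voltage than target_mv for its top clock.
    curve: list of (millivolts, mhz) points, ascending by voltage."""
    shifted = [(mv, mhz + offset_mhz) for mv, mhz in curve]
    cap = next(mhz for mv, mhz in shifted if mv >= target_mv)
    return [(mv, min(mhz, cap)) for mv, mhz in shifted]

# Hypothetical stock points (mV, MHz) -- illustrative only:
stock = [(850, 1800), (900, 1890), (950, 1965), (1000, 2025), (1050, 2070)]
for point in flatten_curve(stock, target_mv=900, offset_mhz=120):
    print(point)  # every point at/above 900mV is capped at 2010 MHz
```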


----------



## ScorpMCP

DarknightOCR said:


> Hello. I don't know if it has already been spoken in the middle of all these pages. but is it possible to put the XOC bios in a msi ventus? does the powerlimit go up, or is it the same at 320W?
> 
> will it help to have 2000mhz clocks?
> will be for use with waterblock
> 
> Thanks


I tried most of the 2-pin bioses with a Ventus; it capped out at about 337w with all of them, even the 370w bioses. A little higher than the 320w stock bios; maybe something else is limiting it.


----------



## bobby_b

Shadownet_ said:


> Well my buddy is asking for me to give him his GPU today. I haven't done any overclocking/undervolting yet.
> 
> Trio vs TUF. Which one should I keep? I'll give him the other one.


For real? Keep the TUF! Higher power limit, MLCCs (better for oc), dual BIOS and more output ports: 2x hdmi + 3x dp.


----------



## spajdr

I can have the Gainward non-GS version; keep that and sell the Eagle OC? Not many reviews of the non-GS version yet.


----------



## DarknightOCR

i don't have the gpu yet; it only arrives on Tuesday.
I am asking so I know whether to keep the card or try another model/brand. From what I see the Ventus OC bios is 320w.
But it is very difficult to find any 3080 in stock.


----------



## martin28bln

spajdr said:


> I can have Gainward nonGS version, keep that and sell Eagle OC? not much reviews about nonGS version yet.


I have a non-GS under water and am happy with it. Only very, very little coil whine. Also got a chip on the better side.... can run [email protected] or [email protected] I would make the decision depending on which chip is better and can run with less voltage.


----------



## bobby_b

spajdr said:


> I can have Gainward nonGS version, keep that and sell Eagle OC? not much reviews about nonGS version yet.


All cards with a power limit under 350 watts are pretty much the same. The only difference in terms of better/worse for those cards is the chip lottery. The Eagle's 340 watt PL vs Gainward's 350 watt PL doesn't make a real difference in performance. I would probably keep the Eagle because of the second hdmi. If you have the chance to test both, do it and then decide. Undervolting results can differ.

So, performance-wise = same. Decide by the looks, the features, undervolting capability or the cost of those cards.


----------



## bobby_b

DELETE


----------



## Mucho

spajdr said:


> I can have Gainward nonGS version, keep that and sell Eagle OC? not much reviews about nonGS version yet.


It's the same card as the Palit GamingPro. You can flash the GS bios and get some MHz in the boost. PL is at 350W. It's the reference PCB. I'm running a Palit OC with a waterblock and am really happy with the card. It runs at 1920MHz/850mV. A higher PL would be nice, but 350 is ok.


----------



## turkishmafia

A few questions for Founder Edition owners please.

1) Does anyone have any pictures of their vertically-mounted Founder's Edition?
2) Is the "v-shaped" LED on the surface of the GPU on both sides of the card or just on the side that has "RTX 3080/90" written on it?
3) Does the plastic face-plate that has the model of the card written on it seem removable? Ideally, I would like to move this to the backside of the GPU so that it is visible when vertically-mounted.

Thanks.


----------



## Mucho




----------



## gemini002

Shadownet_ said:


> Well my buddy is asking for me to give him his GPU today. I haven't done any overclocking/undervolting yet.
> 
> Trio vs TUF. Which one should I keep? I'll give him the other one.


Give him the tuf; the trio has more flashing options. I do like that the tuf has dual bios and 2x hdmi 2.1. Up to you based on what ur needs are.


----------



## dr.Rafi

My best result. Just bear in mind I am using a waterblock on the gpu only, with a fan and heatsinks on the memory and power stages, for my Ventus 3080 flashed with the Aorus Master rom. I am losing around 300 points because I am using a PCI Express gen 3 riser cable and have to lock it to gen 3 in the motherboard bios settings, otherwise it crashes the nvidia audio and sometimes makes the graphics sluggish. Regarding Time Spy, my ram is very incompatible with my system; I have to clock it at 3600 with timings 28-28-28-29-85. Too bad a ram setting, but not bad for Superposition.
All tests at +75 on core and +750 on memory on the Aorus Master bios. I still need to tweak my cpu and system ram overclocking.


https://www.3dmark.com/3dm/51856972




https://www.3dmark.com/spy/14650369


----------



## dr.Rafi

lowrider_05 said:


> It all comes down to luck in the silicon lottery and temperature my TUF OC can score the same but could do much more with a higher Powerlimit
> 
> https://www.3dmark.com/pr/383292


it also seems to me that the 3600XT is helping you a bit; it's clocking quite high


----------



## cstkl1

dr.Rafi said:


> its also seams to me using 3600xt is helping you bit is clocking too high


the cpu doesn't help much in PR.. cache and ram do slightly, but it's mainly all GPU, as long as you've got no garbage running in the background..


----------



## bmgjet

Port Royal is mainly a power-limit benchmark.
The higher your card's power limit, the higher you will score.
You can make more use of your given power limit by running the chip cooler, since the cooler it is, the less power it will use for the same performance and the higher it will self-boost.

The CPU has minimal effect, and ram none as well, since the whole test fits in vram.
In a video checking cpu bottlenecks, a guy benchmarks an FX-8370 vs a 9900K.

The score difference was 3 points extra for the i9 in Port Royal.
For other things like Time Spy, yes, there is a massive difference.


----------



## cstkl1

bmgjet said:


> Port Royal is mainly a power limit benchmark.
> The higher your cards power limit, The higher you will score.
> You can make more use of your given power limit by running the chip cooler since the cooler it is. The less power it will use for the same performance and the higher it will self boost.
> 
> CPU has minimal effect. And ram nothing as well since the whole test fits on the vram.
> In the video of checking cpu bottle necks guy does benchmarks with a FX 8370 vs 9900K.
> 
> Score difference was 3 points extra for the I9 in port royal.
> Other things like time spy then yes there is a massive difference.


Also a bit buggy though.


----------



## Celeras

spajdr said:


> He actually replied that we are just theorycrafters without even having the actual card


He is indeed full of it, but to be fair nobody who replied in this thread saying that actually has the XC3 lol. So he's right about that.


----------



## DarknightOCR

dr.Rafi said:


> My best result, just to put in mind iam using water block on gpu only with fan and heatinks on memory and power stages for my ventus 3080, flashed with Aorus master rom, Iam losing around 300 points because iam using pci express generation 3 raiser cable and have to lock it to gen 3 in motherbaord bios setting otherwise its crashing the nvidia sound and make the graphic slugish sometimes, and regard timespy, my ram is very uncompatible with my system have to clock 3600 and timing 28 28 28 29 85 too bad ram setting but not bad for superposition.
> all tests +75 on core and +750 on memory, of aorus master bios. need to tweak my cpu and system ram overclocking.
> 
> 
> https://www.3dmark.com/3dm/51856972
> 
> 
> 
> 
> https://www.3dmark.com/spy/14650369



What is your boost clock in 3dmark and gaming?
And your power limit?


----------



## dr.Rafi

DarknightOCR said:


> What your Boost clock on 3dmark and gaming?
> And power limit


boost in 3dmark: 1995-2050
max power i've seen so far: 312 watts
using a cpu waterblock only on the gpu makes it run so cool, unlike a full graphics waterblock that sucks in heat from the memory and other power components. and a new result in Superposition with +105 on core and +750 memory


----------



## dr.Rafi

Some improvement


https://www.3dmark.com/3dm/51865188




https://www.3dmark.com/pr/414882


----------



## DarknightOCR

Thanks
But in gaming, what is your avg clock? 1995mhz? 2000mhz?


----------



## dr.Rafi

DarknightOCR said:


> Thanks
> But, in gaming what your clock AVG? 1995mhz? 2000mhz?


Not gaming yet, but again further new records with the cheap Ventus: +105, +1000


----------



## dr.Rafi

Gears 5, 4k max settings.
Don't worry about gpu2; it's just the 2080 ti doing nothing, it only helps when rendering with cuda processing.
Edit: forgot to mention this is all using the PCI Express gen 3 riser, and I finally figured out I am running only x8 instead of x16, which kills performance further. Tomorrow I will remove the 2080 ti and check how it performs at x16.


----------



## dr.Rafi

with google chrome open, which sucks some cpu power too.
gpu-z for power consumption


----------



## VladimirAG

ASUS Strix 3080 OC... best score for now in Port Royal on extreme *air* cooling: https://www.3dmark.com/pr/406673


----------



## VPII

VladimirAG said:


> ASUS Strix 3080 OC... best score for now in Port Royal on extreme *air* cooling: https://www.3dmark.com/pr/406673


That power limit is helping a lot, clearly see it when looking at your max against average clock speed. Great result.


----------



## ShadowYuna

Finally my 3080 FE joins my Nvidia collection. At last, no more F5.


----------



## dr.Rafi

VladimirAG said:


> ASUS Strix 3080 OC... best score for now in Port Royal on extreme *air* cooling: https://www.3dmark.com/pr/406673


Great results


----------



## dr.Rafi

VPII said:


> That power limit is helping a lot, clearly see it when looking at your max against average clock speed. Great result.


Yes power limit and thermals.
Waiting for shunt resistors to ship; I will try to either stretch my Ventus's muscles or kill it.


----------



## VladimirAG

VPII said:


> Great result.





dr.Rafi said:


> Great results


I'll wait for the waterblock, and will probably take the first positions, and not only in this discipline.


----------



## dr.Rafi

VladimirAG said:


> Wait for the waterblock and probably take the first positions and not only in this disciplin


Honestly, try a cpu waterblock; check my thermals. A full graphics waterblock gives you higher temperatures. Just use multiple heatsinks on the memory and power stages with a medium silent fan and it will give you the best results. I know it won't look great, but it's great for benching.


----------



## Nyt Ryda

To anyone else with an MSI Gaming X Trio 3080: does setting a custom fan speed or fan curve in MSI Afterburner work correctly for you guys?

On my card the fan % doesn't follow what I set. For example, if I set 60% it uses 46%. If I set 40%, it uses 30%.

Using 4.6.3 Beta 2, and I tried both of the NVIDIA drivers released for Ampere. EVGA Precision doesn't change it at all either.


----------



## VladimirAG

dr.Rafi said:


> just use multiple heatsinks on memory and power stages with medium silent fan will give you the best results


Thanks for the hint, I'll try! I have an extreme watercooling system, so there will definitely be no problems with the heatsinks; for example: https://www.3dmark.com/pr/199036 or https://www.3dmark.com/fs/21129445


----------



## dr.Rafi

50+ more points just from removing the second graphics card and making the 3080 run at x16 speed


----------



## Reinhardovich773

ShadowYuna said:


> Finally my 3080 FE join my Nvidia Collection. At last no more F5


Hi there Mr Jensen! How's it going?


----------



## MikeGR7

dr.Rafi said:


> honstly try cpu water block check my thermals full graphic water block give you higher tempretures just use multiple heatsinks on memory and power stages with medium silent fan will give you the best results ,i know it wont look great but its great for benching.


Can you share some knowledge: which cpu block fits the 3080?

I tried my best to mod my G12 to fit a 360mm on my TRIO, but nope.
Could not get it below 60c because of crappy contact.

----------



## dr.Rafi

MikeGR7 said:


> Can you share some knowledge, which cpu block fits 3080s?
> 
> Tried my best to mod my G12 to fit a 360mm on my TRIO but nope.
> Could not lower it below 60c because of crappy contact.


I used a classic EK copper cpu waterblock. I took my card's factory heatsink, measured every detail of the card, and drafted it in CAD software with the gpu hole spacing, then printed it on sticker paper as a template. Using the sticker-paper template, I tapped M3 threads through the Intel mounting bracket of the waterblock, then screwed long M3 screws in from the back of the card, using the gpu's original back tension bracket to apply the same pressure as the factory mount. It is not very clear in the photos because the 2080 ti is in front, blocking the view.
The 3080, 2080 ti and 3900X are all in one loop with a single 420 radiator; all fully loaded in 3ds Max and V-Ray rendering, none pass 50c.
The last attachment is a shroud, work in progress. It will have 2 blower fans in the back (12 volt 3d-printer blower fans, which are not very loud) to blow air onto the heatsinks and exhaust out the back of the case. I will 3d print the shroud.


----------



## Mucho

Why are you running a 3080 and a 2080Ti in one system?


----------



## dr.Rafi

Mucho said:


> Why are you running a 3080 and a 2080Ti in one system?


For 3d rendering: V-Ray can use all the resources (cpu and gpu) you have and gets you 16k resolution images in a couple of minutes. And for some lighter games I only use the 2080 ti and let the 3080 rest and relax.
The car is not my drawing; only the materials and lighting are mine.

----------



## asdkj1740

CoolPC (原價屋) forum • View topic - [Unboxing] An all-copper heatsink module and a blower fan carrying the whole load! Gigabyte RTX 3090 TURBO 24GB graphics card.







www.coolpc.com.tw




We now understand why Gigabyte uses that strange adapter on the Gaming and Eagle models.
The indented design on the Turbo series increases airflow to the blower fan; that's why Gigabyte had to use that super-low-profile adapter, to make room for the blower fan on top.
The PCBs of the Eagle and Gaming are the same as the Turbo's, and the Turbo PCB seems to be the original base design.







GeForce RTX™ 3090 TURBO 24G Key Features | Graphics Card - GIGABYTE Global






www.gigabyte.com




"The metal cover with the indented design and 80mm blower fan allows for larger airflow intake."


----------



## asdkj1740

DELLLL


----------



## GAN77

VladimirAG said:


> best score for now in Port Royal on extreme *air* cooling:


Happy for you, colleague! Excellent result!
We urgently need water and a frosty morning in Murmansk.


----------



## spajdr

This EAGLE OC card can take even +1300 on the VRAM; too bad this chip is on the 2x 8-pin version :-(
Gonna try the Gainward non-GS tomorrow.


----------



## irakandjii

Hi folks, I am new here; I like what I see in this forum.
I have an AORUS Xtreme 3080 on order (I chose it because it had a shorter wait list than the Strix 3080). 

I recently read Gigabyte's reply to the capacitor issue, stating that the SP-Caps were not cheaper and that the cards would meet Gigabyte's stated performance guarantees. I noted that this did not affirm a commitment to overclocking, which makes me a bit nervous about Gigabyte as a supplier.

My questions: does anyone have experience with the Xtreme yet, or confirmed the capacitor array? 
Is Gigabyte a reliable vendor, or should I bite the bullet and stand in line for an EVGA or ROG Strix?


----------



## Dannyele

They actually exist!


----------



## TRX250EX

Do I have the fastest 3080 with a 2x 8-pin card, or are the Gigabyte Gaming OC cards just too good?

Edit: nvm, saw an Aorus 3080 owner easily hitting 12378 in Port Royal lol, others reporting only 121XX on the Aorus


----------



## Zeakie

TRX250EX said:


> Do I have fastest 3080 with 2X 8 pin card or is it just Gigabyte Gaming OC cards just too good?
> 
> View attachment 2462494
> 
> 
> 
> View attachment 2462493
> 
> 
> 
> View attachment 2462491


Decent but I've gotten really close to that with my zotac.. silicon lottery winner I presume


----------



## parcher

*ZOTAC 3080 Trinity TUF OC Bios *

https://www.3dmark.com/pr/410436


----------



## Zeakie

parcher said:


> *ZOTAC 3080 Trinity TUF OC Bios *
> 
> https://www.3dmark.com/pr/410436


Whats your overclock/ curve ? Awesome score man


----------



## TRX250EX

parcher said:


> *ZOTAC 3080 Trinity TUF OC Bios *
> 
> https://www.3dmark.com/pr/410436


That's really good. I wonder if the TUF's extra 5W BIOS helps? I doubt it will, tbh. Need to try again, maybe with an open case and colder ambient temp; I did it at around a 25°C room temp.


----------



## parcher

Zeakie said:


> Whats your overclock/ curve ? Awesome score man


The settings, only for benchmarking:
+100 core, +500 mem, Voltage 100%, PL 109%, Fan 90%


Sent from my RNE-L21 using Tapatalk


----------



## Zeakie

parcher said:


> The settings, only for benchmarking:
> +100 core, +500 mem, Voltage 100%, PL 109%, Fan 90%
> 
> 
> Sent from my RNE-L21 using Tapatalk


Great chip 👌 if zotac fixes their rgb issues we ain't got a bad card


----------



## lowrider_05

I finally managed to circumvent the powerlimit on my TUF OC and got my best Superposition score yet. Still on the stock air cooler


----------



## owntecx

lowrider_05 said:


> I finally managed to circumvent the powerlimit on my TUF OC and got my best Superposition score yet. Still on the stock air cooler
> 
> View attachment 2462513


Damn, nice score. how did u circumvent the power limit?


----------



## lowrider_05

owntecx said:


> Damn, nice score. how did u circumvent the power limit?


Well, it's not for the faint of heart.

1. Shorted ALL the shunts with pure solder (the card showed 60 watts under load with the stock BIOS), but the power limit still kicked in.
2. Flashed the Strix 450W BIOS (it showed NO LOAD, and occasionally voltage, as the limiting factor, but still got power-limited a lot).
3. Let the OC Scanner run and saved the result. The saved preset can then be used to set +150, +180, or whatever the card can handle, and only temperature and voltage play a role now.
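For anyone wondering why the shorted shunts read so low: the card estimates current from the voltage drop across each shunt divided by the resistance it assumes, so lowering the effective resistance with solder shrinks the reported power proportionally. A rough sketch of that relationship (the 5 mΩ shunt value and the post-mod resistance are illustrative assumptions, not the card's actual spec):

```python
def reported_power(actual_power_w, rail_voltage_v, r_assumed_ohm, r_effective_ohm):
    """Power the controller *thinks* is drawn when the shunt resistance has
    been lowered from r_assumed_ohm to r_effective_ohm by soldering over it."""
    actual_current = actual_power_w / rail_voltage_v
    v_drop = actual_current * r_effective_ohm    # what the sense line actually measures
    reported_current = v_drop / r_assumed_ohm    # controller still divides by the old R
    return reported_current * rail_voltage_v

# Example: 320 W real draw on a 12 V rail, 5 mOhm shunt shorted down to ~1 mOhm
print(round(reported_power(320, 12.0, 0.005, 0.001)))  # -> 64
```

With the effective resistance at about a fifth of what the controller assumes, a true 320 W shows up as roughly 64 W, which is consistent with the ~60 W reading described above.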


----------



## owntecx

lowrider_05 said:


> well, it´s not for the faint of heart.
> 
> 1. shorted all! the shunts with pure solder (card showd 60Watts under load with stock Bios) but Powerlimit still kicked in.
> 2. Flashed the Strix 450Watt Bios (it showed NO LOAD and Voltage a few times as the limiting factor but still got Powerlimited a lot.
> 3. let the OC Scanner run and saved the Result from there the saved Preset can be used to set +150 +180 or whatever the card can handle and only Temperature and Voltage are playing a role now.
> 
> View attachment 2462517


Well, that's some hardcore mod, not really for the faint of heart. Any estimate of the power draw at the wall?


----------



## lowrider_05

owntecx said:


> Well, thats some hardcore mod, not realy for the faint of the heart. Any estimate on the power draw on the wall?


Wattage is really hard to say, because I can only measure the power of the whole system; a really rough estimate would be around 450 to 500 watts for the card alone.
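One way to get a rough card-only number from a wall meter is to subtract an idle baseline and account for PSU efficiency. A minimal sketch, assuming roughly 90% efficiency (all figures here are hypothetical examples, not measurements from this system):

```python
def estimate_gpu_power(wall_load_w, wall_baseline_w, psu_efficiency=0.90):
    """Rough GPU draw: the extra wall power attributable to the GPU load,
    scaled by PSU efficiency to approximate DC-side watts."""
    return (wall_load_w - wall_baseline_w) * psu_efficiency

# Example: 780 W at the wall under GPU load vs 230 W with the GPU near idle
print(round(estimate_gpu_power(780, 230)))  # -> 495
```

This only isolates the GPU if the rest of the system (CPU load especially) is doing the same work in both readings, so it remains a ballpark figure at best.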


----------



## lowrider_05

And on a Port Royal run the score jumped ~200 points since the power mod:
https://www.3dmark.com/pr/416338


----------



## FoamyV

hey, thoughts on undervolting? Is .85v - constant 1935mhz decent? or should I fiddle with it some more?


----------



## Mucho

FoamyV said:


> hey, thoughts on undervolting? Is .85v - constant 1935mhz decent? or should I fiddle with it some more?


I'm running [email protected], clock jumps between 1905 and 1935


----------



## cstkl1

screw crysis

CAN YOU RUN NFS HEAT!!













----------



## dr.Rafi

Latest best PR score of the 3080 Ventus on the Aorus Master BIOS. Plugged the card directly into the motherboard, running with no riser and PCIe Gen 4 now, and I disabled half of the 3900X in the BIOS, running on only 6 cores, so it now boosts higher and more consistently.


https://www.3dmark.com/3dm/51897338


----------



## cstkl1

dr.Rafi said:


> Last best PR score of 3080 ventus on aorus master bios, plugged the card directly to motherboard, running with no riser and pcie gen 4 now, and i dissabled half of the cpu 3900x in bios running only on 6 cores, so now is boosting higher and consistant.
> 
> 
> https://www.3dmark.com/3dm/51897338


i would use PR to gauge your gpu oc btw...


----------



## dr.Rafi

cstkl1 said:


> i would use PR to gauge your gpu oc btw...





https://www.3dmark.com/pr/417359


----------



## Bennimaru

FoamyV said:


> hey, thoughts on undervolting? Is .85v - constant 1935mhz decent? or should I fiddle with it some more?


Undervolting is working nicely for me. I've got several curves, depending on which resolution/game intensity I'm playing. The most common one I use has [email protected]; the two I use for heavy games/benching are 2070 or [email protected]
Can't wait to finally get the waterblock to see if it can get any better.
My best PR atm 


https://www.3dmark.com/pr/379314


----------



## cstkl1

dr.Rafi said:


> https://www.3dmark.com/pr/417359


I meant your self-gauge, dude. Like, what's the fall-off point?

So far for me the sweet spot is [email protected], boosting to [email protected]


----------



## GTANY

EVGA FTW3 Ultra review : 




Disappointing: the TUF is better than the FTW3 in terms of noise/temperatures, whereas the EVGA is more expensive.


----------



## VladimirAG

lowrider_05 said:


> and a Port Royal run score jumped ~200 points since powermod
> https://www.3dmark.com/pr/416338


The result is not bad, but try to minimize the number of frequency drops (you have six); here's one with only two:










https://www.3dmark.com/pr/416592


----------



## VladimirAG

GAN77 said:


> Happy for you, colleague! Excellent result!
> We urgently need water and a frosty morning in Murmansk.


Minsk partisans here ... yep, we badly need it, but most of all I need better GPU cooling, and that is a small but solvable task.


----------



## martinhal

Has anyone put the Palit under water ?


----------



## lowrider_05

VladimirAG said:


> The result is not bad, but try to minimize the number of frequency drops (six), like this with only two:
> View attachment 2462601
> 
> 
> 
> https://www.3dmark.com/pr/416592


Well, my card is still on air, so there is not much I can do; once I have my water block there will be more headroom.

Your card runs at ~38°C and mine on air runs at ~54°C, and there's the difference.


----------



## DueAlian

dr.Rafi said:


> https://www.3dmark.com/pr/417359


So you have the MSI RTX 3080 Ventus OC model? I assume you water-cooled it, judging by the temperature of 42 degrees? Can you say how the stock cooler performed compared to water cooling?


----------



## VPII

martinhal said:


> Has anyone put the Palit under water ?


In all honesty, I don't think you'll gain much by putting the Palit under water. I have an NZXT Kraken G12 with a Corsair H110 which I could use to test, but the card is so power-limited that it wouldn't even help much.


----------



## Boham_CY

How are some of you getting your Palits to run at 2000+
Anything above 1920 sustained crashes for me


----------



## gemini002

Boham_CY said:


> How are some of you getting your Palits to run at 2000+
> Anything above 1920 sustained crashes for me


sounds like you lost the silicon lottery.


----------



## Nizzen

Boham_CY said:


> How are some of you getting your Palits to run at 2000+
> Anything above 1920 sustained crashes for me


It depends on whether you are comparing clocks at the same load. 2100MHz is easy under a light load, but not so easy to sustain in Port Royal without shunt modding or a VERY cold card.


----------



## DueAlian

VPII said:


> I all honesty, I don't think you'll gain much by putting the Palit under water. I have an Nzxt Kraken G12 with Corsair H110 which I could use to test but it is so power limited that it would not even help much.


Can you please try that? I also have NZXT Kraken G12 with NZXT Kraken X62 currently on my RTX 2080TI and was wondering if I will be able to use them with my MSI RTX 3080 Ventus OC when it finally arrives this week.


----------



## VPII

DueAlian said:


> Can you please try that? I also have NZXT Kraken G12 with NZXT Kraken X62 currently on my RTX 2080TI and was wondering if I will be able to use them with my MSI RTX 3080 Ventus OC when it finally arrives this week.


You'll need to drill holes through the Kraken G12, as the mountings won't line up properly. Here is a YouTube clip of a guy showing how he did it.


----------



## eeroo94

dr.Rafi said:


> Last best PR score of 3080 ventus on aorus master bios, plugged the card directly to motherboard, running with no riser and pcie gen 4 now, and i dissabled half of the cpu 3900x in bios running only on 6 cores, so now is boosting higher and consistant.
> 
> 
> https://www.3dmark.com/3dm/51897338


Did that bios help at all or is it still throttling at 320 watts?


----------



## VladimirAG

*ASUS ROG Strix GeForce RTX 3080 OC Edition* under extreme air cooling... further experiments will be under H2O 







https://www.3dmark.com/pr/418710


----------



## dr.Rafi

eeroo94 said:


> Did that bios help at all or is it still throttling at 320 watts?


Every 2x 8-pin BIOS I've tested so far (most of the VGA BIOS collection on the TechPowerUp site, plus everything I could find here and on some other websites) shows a max of 312 watts in GPU-Z, but the power supply software shows higher consumption with some of them, like the Aorus Master BIOS, and nothing is consistent: some BIOSes give me a higher score at lower clocks, some crash with less overclocking (I mean the total boost clock in GPU-Z), and some give better scores with lower power consumption in the power supply software (I think some have better curves than others). Only 3x 8-pin BIOSes give higher power readings in GPU-Z, which isn't always accurate. The Aorus Master is the best at lower power, and the Asus Strix gives higher results at higher power; it reaches 400 watts.


----------



## lapino

Hey guys, installed an RTX3080 (Asus TUF OC) a few days ago, and getting some random reboots (sometimes when gaming, sometimes when just doing nothing). Removed all drivers and reinstalled them, event viewer does not seem to show a lot except unexpected reboot. Could this be the psu? I have a Corsair RM750 (a few years old though). Never had any issues with my RTX2080 (non-super). So my guess it's the GPU. Any idea where to look?


----------



## lowrider_05

lapino said:


> Hey guys, installed an RTX3080 (Asus TUF OC) a few days ago, and getting some random reboots (sometimes when gaming, sometimes when just doing nothing). Removed all drivers and reinstalled them, event viewer does not seem to show a lot except unexpected reboot. Could this be the psu? I have a Corsair RM750 (a few years old though). Never had any issues with my RTX2080 (non-super). So my guess it's the GPU. Any idea where to look?


Reduce the power limit to, say, 75%; if it is stable at that, I would suggest you try another power supply.
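If Afterburner isn't handy, the same test can be done from an elevated command line with NVIDIA's nvidia-smi tool; the 240 W below is just an example figure, roughly 75% of the TUF's 320 W default:

```shell
# Show the current, default and max enforceable power limits
nvidia-smi -q -d POWER

# Temporarily cap board power at 240 W for PSU troubleshooting
nvidia-smi -pl 240
```

The cap does not survive a reboot, so it's safe for a quick stability test.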


----------



## martin28bln

Is there anybody out there who has finally done a working shunt mod on the reference-PCB 3080 with 2x 8-pin? If the answer is yes: which shunts have to be shorted?


----------



## MikeGR7

VPII said:


> You'll need to drill holes threw the Kraken G12 as the mountings won't line up properly. Here is a youtibe clip of a guy showing how he did it.


Guys, do not try that. I tried it with a TUF on 20/9, the first one I got, and it was a fail imo...
With a 360mm AIO, temps were in the 60s under load... 

It must be something with the contact pressure; it could be because of the rotated bracket, or I suspect the die height is a wee bit different.

Either way, if you're willing to sacrifice a G12 to this experiment, be sure to use a copper shim to have any real chance of success.


----------



## kaydubbed

I have an MSI RTX 3080 VENTUS OC and I flashed the ASUS ROG STRIX vBIOS onto it; the card is pulling a ton of power now [I've seen up to 470W], and I am pretty disappointed that my Time Spy score [+200MHz core/+200MHz mem] is only a few hundred points over my 0.850v/1920MHz undervolt on the MSI vBIOS. Here is my Port Royal score.

The card was boosting to 2,205MHz during the Time Spy test [snippet below] and drawing 115W more than the stock BIOS, and stayed pretty cool. I guess seeing is believing that GA102 hits a brick wall after 350W.


----------



## Mucho

martinhal said:


> Has anyone put the Palit under water ?


Yes, I'm running it with an Alphacool block


----------



## Nizzen

kaydubbed said:


> I have an MSI RTX 3080 VENTUS OC and I flashed the ASUS ROG STRIX vBios onto it and the card is getting a ton of power now [I've seen up to 470w] and I am pretty disappointed about my TimeSpy score [+200mhz/+200mhz mem] being only a few hundred points over my .850v/1920mhz undervolt on the MSI vBios.
> 
> 
> The card was boosting to 2,205mhz during the Timespy test [snippit below] and drawing 115w more than the stock BIOS, and stayed pretty cool. I guess seeing is believing that GA102 hits a brick after 350w.
> 
> View attachment 2462669


Sorry to say, but your CPU score sux. Zen 2 is just too slow 
This benchmark is very CPU dependent and memory SPEED dependent 



https://www.3dmark.com/compare/spy/14684472/spy/14029645


----------



## kaydubbed

I'm not concerned with the CPU score. Just the GPU score.

I also added my Port Royal score to the post.


----------



## dr.Rafi

Keeps going up, the Ventus with the Asus Strix BIOS










https://www.3dmark.com/3dm/51924820


----------



## Nizzen

kaydubbed said:


> I'm not concerned with the CPU score. Just the GPU score.
> 
> I also added my Port Royale score to the post.


Looks like you are missing about 400-500 points in Port Royal. Disable all G-Sync "things", turn off all overlays, etc.


----------



## spajdr

dr.Rafi said:


> Keep going up with ventus with asus strix bios
> View attachment 2462689
> 
> 
> 
> https://www.3dmark.com/3dm/51924820


You're using the STRIX BIOS for a 3x-power-connector card on a 2x-power-connector card?


----------



## spajdr

This Gainward non-GS is not bad; it needs only 0.837V for 1905MHz (tested in Quake II RTX). Will try to go lower with the voltage.


----------



## Orlovki

Mucho said:


> Yes, I`m running it with an Alphacool block


Imagine running such a crap card


----------



## gemini002

spajdr said:


> You using STRIX bios for 3x POWER card on 2x POWER card?


Must be the TUF, as the Strix BIOS does not work correctly on a 2x 8-pin card. I tried it on the TUF and got worse clocks.


----------



## kaydubbed

Well I am using a Ventus OC and the Strix bios and I certainly see the power draw of the Strix. I'm not seeing the performance, though.


----------



## acoustic

The power draw is reporting incorrectly, which is why you "see" higher power draw, but performance doesn't go up. The card is trying to draw power from an 8pin connection that doesn't exist when you run a 3x 8pin BIOS on a 2x 8pin card. Typically it tends to hurt performance compared to a BIOS for a 2x 8pin card.


----------



## dr.Rafi

lapino said:


> Hey guys, installed an RTX3080 (Asus TUF OC) a few days ago, and getting some random reboots (sometimes when gaming, sometimes when just doing nothing). Removed all drivers and reinstalled them, event viewer does not seem to show a lot except unexpected reboot. Could this be the psu? I have a Corsair RM750 (a few years old though). Never had any issues with my RTX2080 (non-super). So my guess it's the GPU. Any idea where to look?


Try to get a minimum of 1000 watts; the TUF is tough, and even NVIDIA's claimed requirement is not accurate.


----------



## dr.Rafi

spajdr said:


> You using STRIX bios for 3x POWER card on 2x POWER card?


Yup, working awesome, and I'm feeding the card with a two-6-pin-to-one-8-pin adapter on each of the card's 8-pins, so the power supply is happy and there's no voltage drop across the cables.


----------



## dr.Rafi

kaydubbed said:


> Well I am using a Ventus OC and the Strix bios and I certainly see the power draw of the Strix. I'm not seeing the performance, though.


It's not the actual power draw. Check your wall socket and compare other BIOSes with the Strix. The Aorus Master BIOS is good, better than the factory Ventus BIOS, and some BIOSes clock higher or lower with the same performance: my Ventus BIOS gives me the same performance at an 1860 core clock as the Aorus Master gives at 1920, but the Aorus Master can go up to 1955 on my card while the Ventus maxes out at 1865.
And I have a CPU water block on the GPU; lower temp = less power = higher, more consistent clocks.


----------



## dr.Rafi

acoustic said:


> The power draw is reporting incorrectly, which is why you "see" higher power draw, but performance doesn't go up. The card is trying to draw power from an 8pin connection that doesn't exist when you run a 3x 8pin BIOS on a 2x 8pin card. Typically it tends to hurt performance compared to a BIOS for a 2x 8pin card.


It is not reading the actual power, but I think this is really good for a Ventus; stock was only 11640.


https://www.3dmark.com/pr/419175


----------



## Forkyyy

Hey guys, I have a 3080 Ventus on order and I wanted to ask if anyone knows a good BIOS to use on it just normally on air. I know the Strix BIOS works well but my concern is that it might require me to really crank the fans up to really make use of it and that's not quite what I'm looking for. Is it worth staying with stock BIOS or is there one with more moderate power limits that has better performance per watt on this card?


----------



## acoustic

I'd say the average temp of 41°C has more to do with that score than anything else. That's incredibly low.

Have you run the Ventus stock BIOS with those low temps?

For 2x 8-pin cards I believe the Gigabyte 390W BIOS is the best?


----------



## dr.Rafi

kaydubbed said:


> I have an MSI RTX 3080 VENTUS OC and I flashed the ASUS ROG STRIX vBios onto it and the card is getting a ton of power now [I've seen up to 470w] and I am pretty disappointed about my TimeSpy score [+200mhz/+200mhz mem] being only a few hundred points over my .850v/1920mhz undervolt on the MSI vBios. Here is my Port Royal score
> 
> 
> The card was boosting to 2,205mhz during the Timespy test [snippit below] and drawing 115w more than the stock BIOS, and stayed pretty cool. I guess seeing is believing that GA102 hits a brick after 350w.
> 
> View attachment 2462669


I know everybody will say it has no effect, but try disabling one CCD of your CPU so it runs on only 8 cores; it will boost more during Port Royal. Also disable V-Sync in the NVIDIA panel, exit any software running in the background (end-task anything left in Task Manager), and don't run any OSD software like Afterburner during the test. And by the way, that is not the actual power being reported: flashing a 3x 8-pin BIOS on a 2x 8-pin card gives you a bit of a bump in power, but not what's reported in software. 
And finally, water cooling gives you more performance too.


----------



## kaydubbed

I'm not disabling a CCD on my CPU to chase GPU air benchmarks lol

I think I'll put the TUF BIOS back on. I didn't give it enough of a shot when I wasn't seeing draw exceed 320W.


----------



## dr.Rafi

Forkyyy said:


> Hey guys, I have a 3080 Ventus on order and I wanted to ask if anyone knows a good BIOS to use on it just normally on air. I know the Strix BIOS works well but my concern is that it might require me to really crank the fans up to really make use of it and that's not quite what I'm looking for. Is it worth staying with stock BIOS or is there one with more moderate power limits that has better performance per watt on this card?


If you're not squeezing out the last few benchmark points, stay on air; the factory BIOS performs best for long-term use, especially when you receive your electricity bill. An alternative is the Aorus Master BIOS, which gives a bit of a performance kick.


----------



## kaydubbed

acoustic said:


> I'd say the average temp of 41c has more to do with that score than anything else. That's incredibly low.
> 
> Have you ran the Ventus stock BIOS with those low temps?
> 
> For 2x 8pin cards I believe the Gigabyte 390w BIOS is the best?


Yeah, something isn't right. My temps are low with the original BIOS, but I'd expect to see higher temps; 41°C is only 19°C above ambient.

I'm going to try more BIOSes to see what ends up getting me true >350W draws, because the TUF BIOS sure didn't.


----------



## dr.Rafi

More progress with Ventus 


https://www.3dmark.com/pr/419534


----------



## dr.Rafi

acoustic said:


> I'd say the average temp of 41c has more to do with that score than anything else. That's incredibly low.
> 
> Have you ran the Ventus stock BIOS with those low temps?
> 
> For 2x 8pin cards I believe the Gigabyte 390w BIOS is the best?


Where did you get a 390-watt Gigabyte BIOS? I think their max is 370 watts. I'm waiting to get the Aorus Xtreme BIOS to see how it performs.


----------



## acoustic

The Gigabyte Gaming OC was 390, no? I may be mistaken. I have a 3x 8-pin card, so I haven't paid super close attention.


----------



## Forkyyy

This is the latest one I found for Gigabyte uploaded on TPU: Gigabyte RTX 3080 VBIOS

Looks to be 370W; not sure if that translates to an actual 370W when flashed onto another GPU, though.
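For anyone attempting these cross-flashes: the usual route is the third-party nvflash tool. A minimal sketch of the commonly cited sequence (the ROM filename is a placeholder; -6 is the switch that overrides the PCI subsystem ID mismatch a different vendor's ROM will trigger):

```shell
# Always back up the card's current BIOS first
nvflash64 --save backup_original.rom

# Flash the downloaded ROM, overriding the subsystem ID mismatch check
nvflash64 -6 gigabyte_3080_370w.rom
```

Keep the backup somewhere safe; reverting is the same flash command pointed at the backup file.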


----------



## Celeras

FoamyV said:


> hey, thoughts on undervolting? Is .85v - constant 1935mhz decent? or should I fiddle with it some more?


Seems about right. I settled on 1950mhz @ 0.862v


----------



## VPII

Look, I thought it was actually normal, which is why I did not make much of it, until a page back when someone posted an HWiNFO64 screenshot of their GPU and I saw their GPU power sitting at 32 watts or thereabouts. Mine, as you'll see in the screen grab, is constantly sitting at 95 to 98 watts and it does not change no matter what I do. Maybe I am missing something, but if someone can explain why my power draw is at that level when the card is doing nothing, I'd really appreciate it.
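One quick way to check this kind of stuck idle draw is to poll the card's power, performance state, and core clock with nvidia-smi; at true idle you'd expect a P8 state and low clocks, while a card held at boost by some background process sits in P0:

```shell
# Poll power draw, P-state and graphics clock once per second
nvidia-smi --query-gpu=power.draw,pstate,clocks.gr --format=csv -l 1
```

If it never leaves P0, Task Manager's GPU column usually reveals which process is keeping the 3D engine active.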


----------



## kaydubbed

Celeras said:


> Seems about right. I settled on 1950mhz @ 0.862v


What's the power draw under load?


----------



## Celeras

kaydubbed said:


> What's the power draw under load?


Still gets as high as 330+W if I do something like Superposition @ 8K. A little lower with 1440p gaming.

But there is a HUGE difference in clock speed. At 0.862V I am pretty much locked at 1950MHz at all times. If I leave the curve alone, I'm quickly throttled down to 1800-1850MHz after a minute or two. A slight power saving, but a big performance boost.
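The flattened-curve undervolt described here can be modeled simply: the curve point at the chosen voltage is raised to the target frequency and every point to its right is pinned to the same value, so the GPU never requests more voltage than the cap. A toy sketch (the curve values are made up for illustration; real curves are edited in Afterburner, not code):

```python
def flatten_curve(curve, cap_mv, target_mhz):
    """Afterburner-style undervolt: every point at or above cap_mv is
    flattened to target_mhz, so boost stops at that voltage."""
    return {mv: (target_mhz if mv >= cap_mv else mhz)
            for mv, mhz in sorted(curve.items())}

# Toy stock curve: millivolts -> MHz (illustrative values only)
stock = {750: 1700, 800: 1800, 862: 1870, 950: 1980, 1050: 2050}
print(flatten_curve(stock, 862, 1950))
# -> {750: 1700, 800: 1800, 862: 1950, 950: 1950, 1050: 1950}
```

The payoff described above follows from this shape: the stock curve would ask for ~1 V to hold high clocks and hit the power limit, while the flattened curve holds 1950 MHz at 862 mV within the same budget.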


----------



## VPII

Celeras said:


> Still gets as high as 330+W if I do something like Superposition @8K. Little lower with 1440p gaming.
> 
> But there is a HUGE difference in clock speed. With 0.862v I am pretty much locked at 1950mhz at all times. If I leave the curve alone, I'm quickly throttled down to 1800-1850mhz after a minute or two. Slight power save, but big performance boost.


Look, I've been playing around with the V/F curve and found a nice way, during Time Spy and the like, to get the highest average clock speed. This was mainly visible while playing Control at 1440p with everything maxed out, including DLSS and RTX, where my clock speed would never, and I mean never, drop below 2000MHz, mostly sitting around 2040MHz, sometimes 2025MHz. The funny thing, though, is that when I reverted to the stock curve and increased my clock speed by +105MHz, I saw my clock speed sitting at 2130, sometimes dropping to 2100MHz and sometimes just a tad lower, but it was perfectly stable at all times. However, when I run Doom Eternal with this +105MHz, clock speeds are still 1950 to 1935MHz, with the power draw a little higher.


----------



## Vapochilled

acoustic said:


> I'd say the average temp of 41c has more to do with that score than anything else. That's incredibly low.
> 
> Have you ran the Ventus stock BIOS with those low temps?
> 
> For 2x 8pin cards I believe the Gigabyte 390w BIOS is the best?


The highest Gigabyte BIOS is 370W.
Even the Aorus is 370W... no difference from the Gaming OC.


----------



## dr.Rafi

VPII said:


> Look I thought it was actually normal which is why I did not make much of it, until a page back when someone posted a Hwinfo64 screenshot showing the GPU and I saw that his GPU power was stating like 32watt or there about. Now mine as you'll see in the screen grab is constantly sitting at 95 to 98watt and it does not change no matter what I do. Maybe I am missing something, but if someone can explain to me why my power draw is at that point when the card is doing nothing I'd really appreciate it.
> View attachment 2462716


In the NVIDIA Control Panel, go to Manage 3D Settings and choose Optimal Power instead of Prefer Maximum Performance.


----------



## VPII

dr.Rafi said:


> In nvidia manager go to manage 3d setting and choose optimal power instead of maximum performance


Hi, thanks, but even after changing it, it still shows 96 to 98 watts used without doing anything.


----------



## VPII

dr.Rafi said:


> In nvidia manager go to manage 3d setting and choose optimal power instead of maximum performance


Interestingly, when I changed it back to Prefer Maximum Performance it dropped to 36 watts for a couple of seconds, then shot up again to 96 watts. He he he


----------



## dr.Rafi

VPII said:


> Hi thanks, but even after changing it it stills show 96 to 98watt used without doing anything.


Reboot. If that does not work, check in Task Manager whether any software is using the GPU. For the NVIDIA panel, try Restore Defaults at the top right of the screen and reboot.
If it's still not working, remove your graphics card driver and install it again.


----------



## dr.Rafi

dr.Rafi said:


> reboot if does not work try to find if any software is using gpu in taske manger, for nvidia manger try restore defult in top right of screen and reboot.


And some cards have two BIOSes; on the overclock-mode BIOS the card is boosted by default and never drops to idle clocks.


----------



## DStealth

What an irony... flashing BIOSes like never before to get the card loaded above 320W in 3D, but still consuming 100W idle in 2D LOL


----------



## martin28bln

Just my experience with undervolting a Gainward Phoenix 3080 under water. Tested stable with Time Spy / Port Royal stress tests and a 5x Metro benchmark loop maxed out with RTX @ 1080p.

[email protected]
[email protected]
[email protected]
[email protected]


----------



## ExarTarkin

Hi everyone!

Just received a 3080 Palit Gaming Pro OC, and I think it has the same behaviour as VPII's card.

The card came with the "new 350W" BIOS, but even with the latest Afterburner beta, when the slider is pushed to 109% power, nothing changes. The card still draws 320W/325W, with a few spikes to 335W.

Has anyone found a solution? 
I hope it's only because Afterburner has limited support at this point...?


----------



## spajdr

martin28bln said:


> Just my experience with undervolting Gainward Phoenix 3080 under water. Tested stable with stresstest Timespy/Portroyal and Metro 5x Benchmarkloop maxed out with RTX @1080P.
> 
> [email protected]
> [email protected]
> [email protected]
> [email protected]


Hi Martin, what was the highest power usage you saw for example at 1950/875?


----------



## hemon

ExarTarkin said:


> Hi everyone!
> 
> Just received a 3080 Palit Gaming Pro OC and I think it have the same behaviour as VPII's card.
> 
> The card came with the "new 350W" bios, but even with the last afterburner beta, when the slide is pushed to 109% power, nothing change. The card still draw 320W/325W, with a few spikes to 335W.
> 
> Did someone found a solution?
> I hope it's only because afterburner have a limited support at this point...?


Same problem here with the TUF: I max out at 350W where the BIOS should support 375W. I don't know how to reach 375W.


----------



## VPII

ExarTarkin said:


> Hi everyone!
> 
> Just received a 3080 Palit Gaming Pro OC and I think it have the same behaviour as VPII's card.
> 
> The card came with the "new 350W" bios, but even with the last afterburner beta, when the slide is pushed to 109% power, nothing change. The card still draw 320W/325W, with a few spikes to 335W.
> 
> Did someone found a solution?
> I hope it's only because afterburner have a limited support at this point...?


I loaded ThunderMaster, the Palit OC and setup tool, and found that when everything was set in Afterburner, the increased power limit did not show in ThunderMaster. At first I thought maybe this was the reason my clocks kept dropping at 320 watts even with the 9% power limit increase, but it's still the same issue unfortunately. I'll check again after my restart.


----------



## kaydubbed

Has anyone gotten an MSI Ventus RTX 3080 working with a 370W BIOS? I've tried the TUF and Gigabyte BIOSes, but they all seem to get stuck at 320W.


----------



## ExarTarkin

In fact, it seems many reference/semi-reference cards are stuck at 320W. I really hope it's a limitation of our current tools.

Tonight I'll try Thunder Master alone without Afterburner...


----------



## gemini002

hemon said:


> Same problem here with the TUF: I have max 350W where the bios can support 375W. I don't know how to reach 375W.


It does not; it only spikes to 375 sometimes. On mine I get 355-365W. I just got the Trio in and I'm testing now. Gotta say it's binned better for sure, hitting 2100 without a BIOS flash yet. After I test OC and stock I will flash the Strix BIOS onto it.


----------



## gemini002

ExarTarkin said:


> In fact it seems many reference/semi-reference cards are stuck at 320W. I really hope it's a limitation of our current tools.
> 
> Tonight I'll try Thunder Master alone, without Afterburner...


From what I have seen here, it looks like most 320W cards are binned lower. My TUF has no problems reading 355-365W, and now this Trio is reaching 360W. I think they are 320W for a reason; probably chips that did not make the cut for the OC versions.


----------



## VPII

gemini002 said:


> From what I have seen here, it looks like most 320W cards are binned lower. My TUF has no problems reading 355-365W, and now this Trio is reaching 360W. I think they are 320W for a reason; probably chips that did not make the cut for the OC versions.


Interesting thought. My Palit with its 320 watts + 9% power limit can easily do 2130 to 2145 while playing Control at 1440P with detail maxed out. I know the chip is good, which is why I have not sold it yet; that, and getting any other model is a long, long wait.


----------



## DStealth

The GB Aorus BIOS is for sure stretching the legs of the Palit 3080 non-OC... very high ambient temps... stock cooler still...


----------



## kaydubbed

Dang, my Ventus isn't getting more than 100% PL with the Aorus BIOS. Weird.


----------



## VladimirAG

*ASUS ROG Strix GeForce RTX 3080 OC Edition* under extreme air cooling... further experiments will be under H2O


----------



## Alemancio

gemini002 said:


> From what I have seen here, it looks like most 320W cards are binned lower. My TUF has no problems reading 355-365W, and now this Trio is reaching 360W. I think they are 320W for a reason; probably chips that did not make the cut for the OC versions.


AFAIK AIBs couldn't bin cards...


----------



## dr.Rafi

kaydubbed said:


> I'm not disabling a CCD on my chipset to chase GPU air benchmarks lol
> 
> I think I'll put the TUF bios back on. I didn't give them enough of a shot when I wasn't seeing draw exceed 320w.


I meant disable it just to test. That's what I do: I try every possibility, learn more, and then I can choose what works best for me for every day.


----------



## dr.Rafi

VPII said:


> Look, I've been playing around with the V/F curve and found a nice way during Time Spy or so to get the highest average clock speed. This was mainly visible while playing Control at 1440P with everything maxed out including DLSS and RTX, where my clock speed would never, and I mean never, drop below 2000MHz, mostly sitting around 2040MHz, sometimes 2025MHz. The funny thing though is that when I reverted back to the stock V/F curve and increased my clock speed by +105MHz, I saw my clock speed sitting at 2130, sometimes dropping to 2100MHz and sometimes just a tad lower, but it was perfectly stable at all times. However, when I run Doom Eternal with this +105MHz, clock speeds would still be 1950 to 1935MHz with power draw a little higher.


If any game pulls more power from the card and kicks in the power limit, the clocks drop immediately to keep the draw within the power limit. Try "Need for Speed Heat", it's a power monster.
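The behaviour described here, clocks dropping the instant a heavier game hits the power limit, can be sketched as a toy model. All numbers below (the scaling constant, voltages, utilization figures) are made up for illustration; this is not NVIDIA's actual boost algorithm.

```python
# Toy model of power-limit throttling: the boost algorithm lowers the
# clock until estimated draw fits under the board power limit.
# The constant k and all figures are illustrative, not real telemetry.

def max_sustained_clock(utilization: float, volts: float, limit_w: float,
                        requested_mhz: float = 2100.0,
                        k: float = 1.47e-7) -> float:
    """Highest clock (MHz) whose estimated dynamic power
    P ~ k * V^2 * f * utilization stays under the power limit."""
    step = 15.0  # boost clocks move in ~15 MHz bins
    clock = requested_mhz
    while clock > 0 and k * volts ** 2 * clock * 1e6 * utilization > limit_w:
        clock -= step
    return clock

# A lighter game leaves headroom; a "power monster" forces a downclock.
light = max_sustained_clock(0.75, 1.05, 320.0)  # 2100.0 — no throttle
heavy = max_sustained_clock(1.0, 1.05, 320.0)   # 1965.0 — clocks pulled down
print(light, heavy)
```

Same card, same offset; only the load changed, which is why a benchmark can hold 2100MHz while a heavy game sits 100MHz+ lower.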


----------



## dr.Rafi

gemini002 said:


> From what I have seen here, it looks like most 320W cards are binned lower. My TUF has no problems reading 355-365W, and now this Trio is reaching 360W. I think they are 320W for a reason; probably chips that did not make the cut for the OC versions.


My Ventus was 320 watts, but not any more. The reason is the quality and size of the power stages, not the GPU chip itself, and thank goodness the power stages can deliver higher power; the only difference is they run hotter, and you need to cool them more.


----------



## dr.Rafi

ExarTarkin said:


> Hi everyone!
> 
> Just received a 3080 Palit Gaming Pro OC and I think it has the same behaviour as VPII's card.
> 
> The card came with the "new 350W" BIOS, but even with the latest Afterburner beta, when the slider is pushed to 109% power, nothing changes. The card still draws 320W/325W, with a few spikes to 335W.
> 
> Has anyone found a solution?
> I hope it's only because Afterburner has limited support at this point...?


You may need to try Need for Speed Heat and check.


----------



## kaydubbed

I don't understand why these BIOSes work on your Ventus and mine don't. I get the 109%/117% power limits from the TUF and Aorus BIOS respectively, but I can't push the card to draw more than 320W.


----------



## Mucho

Orlovki said:


> Imagine running such a Kackkarte ("crap card")


Oh, what an expert. Palit isn't a s h i t card (Kackkarte)... the only s h i t I see is that kind of comment.


----------



## Alemancio

EVGA Released 500W BIOS for their FTW3 Cards


----------



## acoustic

Lol holy ****


----------



## Alemancio

Alemancio said:


> EVGA Released 500W BIOS for their FTW3 Cards


I don't think it'll help a lot on air; remember that this card is also designed for LN2. Cool to see EVGA trying though.


----------



## Edge0fsanity

Alemancio said:


> EVGA Released 500W BIOS for their FTW3 Cards


Will have to flash this tonight; I just finished up the V/F curve this morning for the 450W BIOS. Wasted my time, I guess, lol.


----------



## spajdr

Colleagues, what would be the best BIOS for the non-GS Gainward?


----------



## dev1ance

^
Probably Palit OC for reference design boards. 

Best I can get is 18,943. So close to 19k.


https://www.3dmark.com/spy/14698252


----------



## spajdr

@dev1ance are you sure? It has the same power limit settings as the Gainward.


----------



## Edge0fsanity

Alemancio said:


> I dont think it'll help a lot on air, remember that this card is done also for LN2. Cool to see EVGA trying though.


It's probably worth 15-30MHz on air with the V/F curve tuned to never throttle. I'll know for sure soon.


----------



## acoustic

The 500W BIOS is not compatible with the 3080. It's for the 3090 only; it's being misreported to also work with the 3080.

I get "unsupported adapter" on my 3080 when I try to flash it. Quite honestly, on air cooling 450W is already unmanageable for thermals.


----------



## dev1ance

spajdr said:


> @dev1ance are you sure? It has the same power limit settings as the Gainward.


Is it also a 350W power limit? If so, you won't find any more gains. You could try the 370/380W stuff, but I doubt you'll get a great result, as the reference cards really prefer the reference-card BIOSes.


----------



## ps31791

Is there any word on an NVFlash update to allow Founders Edition cards to be flashed?


----------



## kaydubbed

I'm still wondering how the dude got more than 320W on his Ventus. I tried literally every 2x8-pin BIOS released in the last week, and I get increased power limits on my slider, but the card sure doesn't use them. It stays under 320W no matter what I throw at it.
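One way to settle "does the card ever actually exceed 320W" arguments is to log board power during a benchmark rather than eyeballing the OSD. A minimal sketch, assuming `nvidia-smi` is on PATH (the parsing runs here on a canned sample string, so the snippet works even without a GPU present):

```python
# Sketch: read board power draw via nvidia-smi's CSV query output, so a
# benchmark run can be logged and the true peak inspected afterwards.
import subprocess

QUERY = ["nvidia-smi", "--query-gpu=power.draw",
         "--format=csv,noheader,nounits"]

def read_power_draw(raw: str) -> list[float]:
    """Parse the CSV output of the query above into watt readings."""
    return [float(line.strip()) for line in raw.splitlines() if line.strip()]

def poll_gpu() -> list[float]:
    """Query the real tool (needs an NVIDIA driver installed)."""
    out = subprocess.run(QUERY, capture_output=True, text=True).stdout
    return read_power_draw(out)

# Canned sample: the kind of reading a 320 W-limited card tops out at.
sample = "319.87\n"
readings = read_power_draw(sample)
print(max(readings))  # 319.87
```

Calling `poll_gpu()` in a loop once a second while a benchmark runs, then taking `max()`, shows whether a flashed power limit is actually being used or silently clamped.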


----------



## bmgjet

What's the power draw from the other parts?
The GPU chip and SRC rails have their own power limits.
So board power could be under the max power limit, but the card won't pull any more if those other rails are already at their limit. High-leakage chips will hit the GPU chip power limit really easily.
I'm not sure if it's 200 or 250W for 2-plug 3080s.
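The point about per-rail limits can be illustrated with a small sketch: the effective ceiling is whichever rail runs out of headroom first, not the total board limit. The rail names loosely follow GPU-Z/HWiNFO conventions; all wattages here are made up.

```python
# Board power can show spare headroom while one rail (e.g. the GPU chip
# rail) is already pegged at its own limit — that rail throttles the card.
# All limits and draws below are illustrative, not real BIOS values.

def headroom(draw_w: dict, limits_w: dict) -> dict:
    """Per-rail remaining headroom in watts; <= 0 means that rail is capped."""
    return {rail: limits_w[rail] - draw_w.get(rail, 0.0) for rail in limits_w}

limits = {"board": 370.0, "gpu_chip": 250.0, "src": 80.0}
draw   = {"board": 330.0, "gpu_chip": 250.0, "src": 55.0}

hr = headroom(draw, limits)
bottleneck = min(hr, key=hr.get)
print(bottleneck)  # gpu_chip — board shows 40 W spare, yet clocks still drop
```

This is why a flashed 370W BIOS can still behave like a 320W card: the slider raises the board limit, but not necessarily the chip-rail limit underneath it.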


----------



## dr.Rafi

kaydubbed said:


> I don't understand why your BIOS work on your Ventus, and mine don't. I get the 109/117 Power Limits from the TUF and Aorus BIOS respectively, but I can't push the card to draw more than 320w.


Please check my replies in this thread and try to apply most of the instructions I mentioned; if you follow along, you'll see how I brought the card from 11600 to 12595 in Port Royal.
I'm now working toward 13000.


----------



## dr.Rafi

My joy in overclocking is to get something cheap in a graphics card family and keep modding until I bring it to the top end of the family or beyond. That's what I call fun, not buying the top card and then getting top scores so easily with less modding.


----------



## TK421

dev1ance said:


> ^
> Probably Palit OC for reference design boards.
> 
> Best I can get is 18,943. So close to 19k.
> 
> 
> https://www.3dmark.com/spy/14698252







spajdr said:


> @Deviance are you sure? it have same power limit settings as Gainward





kaydubbed said:


> I'm still wondering how the dude got more than 320w on his Ventus. I tried literally every 2x8 bios released in the last week today and I get increased Power Levels on my slider, but the card sure doesn't use it. It stays under 320w no matter what I throw at it.




I would like to know about this too. I have a Trinity OC from Zotac and it seems to have the standard PCB?

Even the FTW3 from EVGA has the same voltage controller as this model, so is it worth trying to flash the 500W FTW3 BIOS on this card? Has the nvflash link been updated to be able to accomplish this?

PCB pic


----------



## dev1ance

TK421 said:


> Even the FTW3 from EVGA has the same voltage controller as this model, so it's worth to try and flash the 500w FTW3 bios on this card? The nvflash link has been updated to be able to accomplish this?


No, because we have 2x8-pin and a completely different power phase design. If you read the thread, you're not getting anything with 3x8-pin BIOSes on your card. Only certain BIOSes are able to actually bring our GPU to 350W (e.g. the Palit OC BIOS). Everything else is questionable and might take it to 340W but not the full 350W, or ends up disabling features (e.g. losing a DP/HDMI port).


----------



## TK421

dev1ance said:


> No, because we have 2x8-pin and a completely different power phase design. If you read the thread, you're not getting anything with 3x8-pin BIOSes on your card. Only certain BIOSes are able to actually bring our GPU to 350W (e.g. the Palit OC BIOS). Everything else is questionable and might take it to 340W but not the full 350W, or ends up disabling features (e.g. losing a DP/HDMI port).


Here this person flashed the reference 2x8-pin EVGA XC3 with the EVGA FTW3 XOC vbios.

I assume it can be done by flashing the stock FTW3 vbios, then using EVGA's updater tool to get the XOC vbios flashed.


----------



## dev1ance

TK421 said:


> I assume it can be done by flashing the stock FTW3 vbios, then using EVGA's updater tool to get the XOC vbios flashed


Feel free to flash your card and report back on how it does. That guy talks a lot of bullshit and we discussed it already if you go through the thread.


----------



## TK421

dev1ance said:


> Feel free to flash your card and report back on how it does. That guy talks a lot of bullshit and we discussed it already if you go through the thread.


Which page do I read from?


----------



## TK421

Oh, here's the explanation of why the EVGA vbioses don't work.


----------



## ExarTarkin

dr.Rafi said:


> you may need to try need for speed heat and check.


Sadly I don't own it.
For now my "power-hungry" game for testing is Shadow of the Tomb Raider, the first game to crash my 1080 Ti with an undervolted custom curve.

By the way, the Palit OC tool doesn't work better than Afterburner for me.


----------



## spajdr

bmgjet said:


> Whats the power draw from the other parts.
> GPU Chip and SRC have there own power limits.
> So board power could be under the max power limit but it wont pull any more if those others are already on power limit. High leakage chips will hit the GPU Chip power limit really easily.
> Im not sure if its 200 or 250W for 2 plug 3080s.


So 0.875V for 1935-1950MHz that hits 330-340W all the time in Quake II RTX could be marked as a high-leakage chip, I suppose?
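As a rough sanity check on that, dynamic power scales approximately with V²·f, so you can estimate what an undervolt *should* draw and compare against what the card actually pulls. This is a first-order approximation with an illustrative stock operating point; real cards add leakage and memory power on top, which is exactly the gap being discussed.

```python
# First-order dynamic-power estimate: P scales with V^2 * f.
# The 320 W @ 1.000 V / 1905 MHz reference point is illustrative only.

def scaled_power(p_ref_w: float, v_ref: float, f_ref_mhz: float,
                 v_new: float, f_new_mhz: float) -> float:
    """Estimate power at a new voltage/frequency from a reference point."""
    return p_ref_w * (v_new / v_ref) ** 2 * (f_new_mhz / f_ref_mhz)

est = scaled_power(320.0, 1.000, 1905.0, 0.875, 1950.0)
print(round(est))  # 251
```

If the V²·f estimate says ~250W but the card measures 330-340W at 0.875V, the excess is static/leakage and non-core power, which supports calling it a leaky chip.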


----------



## dr.Rafi

ExarTarkin said:


> Sadly I don't own it.
> For now my "power-hungry" game for testing is Shadow of the Tomb Raider, the first game to crash my 1080 Ti with an undervolted custom curve.
> 
> By the way, the Palit OC tool doesn't work better than Afterburner for me.











Buy Need For Speed: Heat ENG - Origin CD KEY cheap (www.gamivo.com)

You can buy the key on the link above; it is much cheaper than buying direct from EA.


----------



## daveleebond

Can anyone recommend an easy-to-follow guide for a manual voltage curve? Is the OC Scanner within AB still worth trying on Ampere?


----------



## asdkj1740

https://www.bilibili.com/video/BV1ey4y1r7px


Gigabyte 3080 Eagle, AIO cooling, 360mm Deepcool AIO.
Furmark 1440p, 0xAA (off), 24C ambient temp, open test bench:
43C @ 2000rpm @ 350W
(stock air cooling: 61C @ 1800rpm & 335W, 26C ambient, open test bench, https://www.bilibili.com/video/BV1NK4y1Y7Dk )

Can't wait to see more AIO mounting kits coming out, like the NZXT G12.


----------



## marc0053

I was fortunate to play with two different 3080 gpus, 1 Giga OC gaming and 1 ASUS Strix.
So far max port royal score with overclocks on air and 9900k on air stock settings:

OC gaming: 12294


https://www.3dmark.com/3dm/51866437?




ASUS Strix: 12832


https://www.3dmark.com/3dm/52003684?



On average the Strix runs about 100MHz higher at a max power of around 450W, while the OC Gaming was in the mid-300s of watts.
As an aside, both GPUs do about 100MH/s Ethereum hashrate mining between 220 and 250 watts fairly easily.


----------



## DarknightOCR

On the 3080 Ventus, I have tested almost all the BIOSes.
The one that gave me the highest score was the 3080 Master 2x8-pin.

Now I'm back on the original BIOS, and I see that the PCIe slot only gives me 36W of maximum power.
Would doing the shunt mod only on the PCIe slot solve the problem and increase the PCIe power to 75/100W?
In conjunction with the 150W of each 8-pin, that would already give a power budget of 375/400W on the MSI Ventus.

Ideas and opinions?
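The budget being described is simple addition against the PCIe spec values (75W from the slot, 150W per 8-pin), so a slot-only shunt mod moves the *readable* budget like this. A sketch only; what the card actually draws still depends on the BIOS power limits, and the 36W slot reading is taken from the post above.

```python
# Connector power budget for a 2x8-pin card, using PCIe spec values.
# A shunt mod fools the sensing, it does not add real connector capacity.
PCIE_SLOT_W = 75.0    # spec limit for the x16 slot
EIGHT_PIN_W = 150.0   # spec limit per 8-pin auxiliary connector

def board_budget(n_eight_pin: int, slot_w: float = PCIE_SLOT_W) -> float:
    """Total readable connector budget in watts."""
    return slot_w + n_eight_pin * EIGHT_PIN_W

stock  = board_budget(2, slot_w=36.0)  # Ventus reportedly reads only 36 W from the slot
modded = board_budget(2)               # slot shunt-modded to read the full 75 W
print(stock, modded)  # 336.0 375.0
```

That matches the 375W figure in the post; pushing the slot reading past 75W (the 100W case) would exceed the slot's spec, which is one reason shunt mods are usually done on the 8-pin shunts instead.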


----------



## kaydubbed

DarknightOCR said:


> On the 3080 Ventus, I have tested almost all the BIOSes.
> The one that gave me the highest score was the 3080 Master 2x8-pin.
> 
> Now I'm back on the original BIOS, and I see that the PCIe slot only gives me 36W of maximum power.
> Would doing the shunt mod only on the PCIe slot solve the problem and increase the PCIe power to 75/100W?
> In conjunction with the 150W of each 8-pin, that would already give a power budget of 375/400W on the MSI Ventus.
> 
> 
> Ideas and opinions?


I have the Aorus Master BIOS on my Ventus, and even though the card reports a 370W limit from the BIOS, I can't get it to go over 324W.

It's driving me nuts.


----------



## dr.Rafi

Stretching the Ventus with the old EVGA FTW3 Ultra 400W BIOS.


https://www.3dmark.com/3dm/52012550


----------



## SoldierRBT

Got my 3080 FTW3 Ultra today. Not a fan of the design. My first choice was Strix but it's impossible to get. Here's my result in Time Spy with the 450W beta BIOS.



https://www.3dmark.com/spy/14731637



Graphics Score: 20 058

It runs cool with high clocks, but it feels like 450W is still limiting these cards. Gonna wait for the waterblock.


----------



## Adrian76

ExarTarkin said:


> Sadly I don't own it.
> For now my "power-hungry" game for testing is Shadow of the Tomb Raider, the first game to crash my 1080 Ti with an undervolted custom curve.
> 
> By the way, the Palit OC tool doesn't work better than Afterburner for me.


I thought I had solved the undervolt on my 3080 after it passed SOTTR for over 3 hours, until I ran The Division 2, which crashed after 1-2 hours lol. Had to bump the voltage some more.


----------



## dr.Rafi

SoldierRBT said:


> Got my 3080 FTW3 Ultra today. Not a fan of the design. My first choice was Strix but it's impossible to get. Here's my result in Time Spy with the 450W beta BIOS.
> 
> 
> 
> https://www.3dmark.com/spy/14731637
> 
> 
> 
> Graphics Score: 20 058
> 
> It runs cool and high clocks but feels like 450W still limiting these cards. Gonna wait for the waterblock.


Port Royal is more GPU-dependent; Time Spy is usually pumped up by RAM and CPU speed.


----------



## dr.Rafi

SoldierRBT said:


> Got my 3080 FTW3 Ultra today. Not a fan of the design. My first choice was Strix but it's impossible to get. Here's my result in Time Spy with the 450W beta BIOS.
> 
> 
> 
> https://www.3dmark.com/spy/14731637
> 
> 
> 
> Graphics Score: 20 058
> 
> It runs cool and high clocks but feels like 450W still limiting these cards. Gonna wait for the waterblock.


You can flash the 500W BIOS; check page 88 or 89, I think, for the link.


----------



## changboy

Still waiting for my ASUS Strix 3080. I ordered it on October 1 from Memory Express, and they got some of them in this week, but I didn't receive a mail for mine, so I need to keep waiting. I don't know if they got a lot of pre-orders on this one. It's bad waiting like this after you've paid for the card.


----------



## djriful

Nizzen said:


> It depends if you are comparing the clocks at the same load. 2100mhz is easy with light load, not so easy sustained in Port Royal without shuntmodding or VERY cold card


Running on fan 3080 FE... https://www.3dmark.com/pr/394068











https://www.3dmark.com/spy/14525238













https://www.3dmark.com/spy/14525124













https://www.3dmark.com/nr/338558













https://www.3dmark.com/fs/23734207













https://www.3dmark.com/fs/23734125


----------



## DarknightOCR

dr.Rafi said:


> Stretching the ventus with evga ftw ultra old bios 400watt
> 
> 
> https://www.3dmark.com/3dm/52012550
> 
> 
> View attachment 2462963
> 
> View attachment 2462964


But with the FTW3 BIOS the power limit shows wrong, right?
The Ventus only has 2x8-pin, so the power limit shown in GPU-Z / MSI AB must be false.
Do you notice a difference from the original or Master BIOS with that FTW3 BIOS? Can you get a higher clock boost and a higher average clock?
With or without the shunt mod?
Sorry for the questions, but it's to try to understand, because I can't get my TDP to pass 320/330W with any BIOS. I tested the 450W BIOS, and although MSI AB showed 430W, the clocks and performance were the same or worse.


----------



## DStealth

dr.Rafi said:


> Stretching the ventus with evga ftw ultra old bios 400watt
> 
> 
> https://www.3dmark.com/3dm/52012550


This is a 2x8-pin card with just a BIOS flash, without hardware shunts?
If yes, wow, this is the highest result I've seen. How does the consumption go with this BIOS? I tested it a long time ago on my Palit 2x8-pin card and wasn't able to pass 1830-1850MHz 3DMark load frequencies, as the load balancing was totally off while it was measuring consumption for a third 8-pin it doesn't have.


----------



## BluePaint

SoldierRBT said:


> Got my 3080 FTW3 Ultra today. Not a fan of the design. My first choice was Strix but it's impossible to get. Here's my result in Time Spy with the 450W beta BIOS.
> 
> 
> https://www.3dmark.com/spy/14731637
> 
> 
> Graphics Score: 20 058
> It runs cool and high clocks but feels like 450W still limiting these cards. Gonna wait for the waterblock.


Great score and certainly a good chip. 2148MHz average at 58 Celsius sounds promising for water cooling.

I am close to 20,000 points with an MSI Trio + Strix BIOS, but the 3900X is holding the TS GPU score back a little (I actually get a slightly higher GPU score with my [email protected]). Looking forward to Zen 3 to test whether AMD can finally reach Intel's level of performance in TS and other CPU-dependent games/benchmarks.


----------



## zhrooms

Earlier today / late last night, Gigabyte pushed out a new BIOS for the Xtreme, as expected (Gigabyte reps said last week that they were working on one); we just didn't know when it'd be out and at what power limit. Here it is: 370W default and a 450W maximum power limit, matching both the FTW3 and Strix. It has the same MSRP as the Strix OC, but we don't know anything about the VRM design yet. What we do know is that it features a triple-fan cooler taking up *3.5* slots, so cooling could be better than both the FTW3 and Strix. Time will tell; we need someone to simply test the cards.

AORUS GeForce RTX™ 3080 XTREME 10G Support | Graphics Card - GIGABYTE Global


----------



## asdkj1740

zhrooms said:


> Earlier today / late last night, Gigabyte pushed out a new BIOS for the Xtreme, as expected (Gigabyte reps said last week that they were working on one); we just didn't know when it'd be out and at what power limit. Here it is: 370W default and a 450W maximum power limit, matching both the FTW3 and Strix. It has the same MSRP as the Strix OC, but we don't know anything about the VRM design yet. What we do know is that it features a triple-fan cooler taking up *3.5* slots, so cooling could be better than both the FTW3 and Strix. Time will tell; we need someone to simply test the cards.
> 
> AORUS GeForce RTX™ 3080 XTREME 10G Support | Graphics Card - GIGABYTE Global


Do you have the link to that Gigabyte rep reply?

From what I remember, the Aorus Xtreme is $899, currently the most expensive 3080 available.
For a 16+4 VRM design, the Aorus Master priced at $849 is crazy.
The Colorful Advanced OC, ASUS TUF, etc. are priced well below $849 while also having 16+4.


----------



## DueAlian

djriful said:


> Running on fan 3080 FE... https://www.3dmark.com/pr/394068
> View attachment 2462981
> 
> 
> 
> 
> https://www.3dmark.com/spy/14525238
> 
> 
> View attachment 2462982
> 
> 
> 
> 
> https://www.3dmark.com/spy/14525124
> 
> 
> View attachment 2462983
> 
> 
> 
> 
> https://www.3dmark.com/nr/338558
> 
> 
> View attachment 2462984
> 
> 
> 
> 
> https://www.3dmark.com/fs/23734207
> 
> 
> View attachment 2462985
> 
> 
> 
> 
> https://www.3dmark.com/fs/23734125
> 
> 
> View attachment 2462986


what kind of magical air is your gpu on? roids


----------



## zhrooms

asdkj1740 said:


> Do you have the link to that Gigabyte rep reply?
> 
> From what I remember, the Aorus Xtreme is $899, currently the most expensive 3080 available.
> For a 16+4 VRM design, the Aorus Master priced at $849 is crazy.
> The Colorful Advanced OC, ASUS TUF, etc. are priced well below $849 while also having 16+4.


No, it was a private conversation on discord.

Xtreme is $900 yes, but so is the Strix OC, and the FTW3 Ultra is $810, a lot cheaper for the same performance. So currently the FTW3 is the obvious card to get.

Master model is horrible yes, two power connector card at $849. VRM design doesn't really matter, even Gaming X Trio with 13+3 can safely run 520W. Power limit is what everyone should look at primarily.


----------



## asdkj1740

zhrooms said:


> No, it was a private conversation on discord.
> 
> Xtreme is $900 yes, but so is the Strix OC, and the FTW3 Ultra is $810, a lot cheaper for the same performance. So currently the FTW3 is the obvious card to get.
> 
> Master model is horrible yes, two power connector card at $849. VRM design doesn't really matter, even Gaming X Trio with 13+3 can safely run 520W. Power limit is what everyone should look at primarily.


The Strix OC on Newegg US is $849 only.

I know the VRM matters almost nothing, but the saved cost is not passed on to the selling price, which is a huge shame.
If you check out the Gigabyte 3090 Turbo PCB, Gigabyte cut a lot of SP caps and replaced them with SMD-type solid caps on the PCBs of the Eagle/Gaming/Master.

It's sad that since Gigabyte's Matthew is gone, it is hard to reach a Gigabyte rep.


----------



## TK421

zhrooms said:


> No, it was a private conversation on discord.
> 
> Xtreme is $900 yes, but so is the Strix OC, and the FTW3 Ultra is $810, a lot cheaper for the same performance. So currently the FTW3 is the obvious card to get.
> 
> Master model is horrible yes, two power connector card at $849. VRM design doesn't really matter, even Gaming X Trio with 13+3 can safely run 520W. Power limit is what everyone should look at primarily.


Does the Master have a digital voltage controller like the Strix and FE?

Most AIBs use the uPI analog controller, while the more expensive option is the MPS digital one.

Surprising we still haven't seen a Strix XOC BIOS; the 2080 Ti and 1080 Ti had one very early in their lifecycles.


----------



## SoldierRBT

BluePaint said:


> Great score and certainly a good chip. 2148Mhz average at 58 celsius sounds promising for water cooling.
> 
> I am close to 20.000 points with a MSI Trio + Strix bios, but the 3900X is holding the TS GPU score back a little (I actually get a little higher GPU score with my [email protected]). Looking forward to Zen 3 to test whether AMD can finally reach Intels level of performance in TS and other CPU dependent games/benchmarks.


Thank you. 58C is very good, but loud at 100% fan speed. It holds 2145-2160 through the run, with some dips to 2115MHz when hitting the power limit (450-453W). Starting the test it maintains 2190MHz, but it goes down when the temperature reaches 50C+.

What can I change in Windows to improve the score? Do you guys touch the Nvidia control panel? My result is just from adjusting clocks in MSI Afterburner.


----------



## TK421

SoldierRBT said:


> Thank you. 58C is very good but loud at 100% fan speed. It holds 2145-2160 through the run with some dips to 2115MHz when hitting power limit (450-453W). Starting the test it maintains 2190MHz but goes down when temperature reached 50C+.
> 
> What can I change on Windows to improve score? Do you guys touch Nvidia control panel? My result is just adjusting clocks in MSI Afterburner.


there's a github script to strip windows of all bloat components to increase benchmark scores


----------



## zhrooms

asdkj1740 said:


> strix oc on newegg us is 849 only.
> 
> if you check out the 3090 gigabyte turbo pcb, gigabyte cuts a lot of sp caps and replaced them with smd type solid caps on the pcb of eagle/gaming/master.


I am aware, but only Newegg; other American & European e-tailers charge $899 for the OC variant and $849 for the non-OC. Newegg is an exception, not the actual MSRP.

They're not SP caps, and it doesn't matter; as long as there's at least one ceramic cluster it seems to be fine. Very little testing has been done to determine how running polymer-only negatively affects overclocking.


TK421 said:


> does the master have digital voltage controller like strix and FE? most AIB use the UPI analog controller, while the more expensive option is the MPS digital one
> 
> surprising we still haven't seen a strix xoc bios? the 2080ti and 1080ti had one very early in its lifecycle


Looks like uPI, but I can't be entirely sure without seeing high-res pictures of it; it's a casual 16+4 (max) PCB, though, so it does not make sense to use MPS controllers.

Not surprising; it's very unlikely there will ever be an XOC BIOS for the 3080, as it's not the (actual) flagship card that every overclocker will run; the Kingpin is exclusive to the 3090, as an example. Anyone owning a 3080 or planning on getting one should forget about ever running "unlimited", unless you're willing to shunt mod it. It was also essentially dumb luck that we got the Strix 2080 Ti XOC BIOS; that's not happening again (it was uploaded to YouTube by Dancop).


----------



## TK421

zhrooms said:


> I am aware, but only Newegg; other American & European e-tailers charge $899 for the OC variant and $849 for the non-OC. Newegg is an exception, not the actual MSRP.
> 
> They're not SP caps, and it doesn't matter; as long as there's at least one ceramic cluster it seems to be fine. Very little testing has been done to determine how running polymer-only negatively affects overclocking.
> 
> Looks like uPI, but I can't be entirely sure without seeing high-res pictures of it; it's a casual 16+4 (max) PCB, though, so it does not make sense to use MPS controllers.
> 
> Not surprising; it's very unlikely there will ever be an XOC BIOS for the 3080, as it's not the (actual) flagship card that every overclocker will run; the Kingpin is exclusive to the 3090, as an example. Anyone owning a 3080 or planning on getting one should forget about ever running "unlimited", unless you're willing to shunt mod it. It was also essentially dumb luck that we got the Strix 2080 Ti XOC BIOS; that's not happening again (it was uploaded to YouTube by Dancop).



Yeah, I guess now I will spend extra on the power limit "paid DLC" for these cards...

in the form of resistors.


----------



## dr.Rafi

BluePaint said:


> Great score and certainly a good chip. 2148Mhz average at 58 celsius sounds promising for water cooling.
> 
> I am close to 20.000 points with a MSI Trio + Strix bios, but the 3900X is holding the TS GPU score back a little (I actually get a little higher GPU score with my [email protected]). Looking forward to Zen 3 to test whether AMD can finally reach Intels level of performance in TS and other CPU dependent games/benchmarks.


Try disabling half of the 3900X's cores in the BIOS and make it run on 6 cores only; it will boost higher and get you higher scores.


----------



## kaydubbed

TK421 said:


> there's a github script to strip windows of all bloat components to increase benchmark scores


Anyone have a link for this?


----------



## kaydubbed

dr.Rafi said:


> Try disabling half of the 3900X's cores in the BIOS and make it run on 6 cores only; it will boost higher and get you higher scores.


Out of all the BIOSes you tried on the Ventus, which worked best for consistent power? It would be helpful if you listed the BIOS hex/version code.


----------



## SoldierRBT

Time Spy GPU Score: 20 296
Place 7 GPU Score 


https://www.3dmark.com/spy/14740703



+185
+1200

It may do +190 but needs more testing. After +1250 the score drops.


----------



## asdkj1740

zhrooms said:


> I am aware, but only newegg, other american & european e-tailers charges $899 for the OC variant and $849 for the Non-OC. Newegg is an exception, not the actual MSRP.
> 
> They're not SP caps, and it doesn't matter, as long as there's at least one ceramic cluster it seems to be fine, very little testing has been done to conclude how it negatively affects overclocking, running polymer only.


I forgot where I saw the MSRP list of ASUS models. There is a model called ROG 3080 T11G, where T means TOP; this model is said to be $899.

I am talking about these input & output filtering caps.
This seems to be the base PCB design for the Gaming and Eagle PCBs.
Gigabyte replaces those SP caps on the front of the Gaming and Eagle PCBs and cuts them off on the back side, while still keeping that strange PCIe connector on them.


原價屋@酷！PC • View topic - [Unboxing] All-copper cooling module with a blower fan carrying the load! Gigabyte RTX 3090 TURBO 24GB graphics card.

www.coolpc.com.tw


----------



## ncck

Anyone here own a 3080 and also have GeForce Experience installed? I don't want to install the software; I was just wondering if someone could activate my Legion key and send it to me. And yes, I do own a 3080, here's a screencap for proof. Or is it not possible to do? If it is, I'd really appreciate it; also, if this is against the rules, then apologies and I'll edit/remove the post.

Edit: I also tried to activate it on the Nvidia website, but it requires GFE to be installed.


----------



## BluePaint

SoldierRBT said:


> What can I change on Windows to improve score? Do you guys touch Nvidia control panel? My result is just adjusting clocks in MSI Afterburner.


In the Nvidia control panel, besides the standard stuff (power management max, texture filtering high performance, G-Sync off, ...), setting pre-rendered frames to 4 can actually give some more points in some 3DMark benches. In Windows, I make sure that there is nothing running in the background by looking at Task Manager.
Otherwise, with your card, if you want to climb the ladder some more, try to give your PC access to some fresh morning air, since temps are probably the biggest limit with your card. It seems to be a really good chip if you haven't even optimized the curve yet and just added core clock. +1200 is also a very good VRAM overclock; it could be that the optimum is slightly below that (each TS run always has a slightly different result even with the same settings), just judging from other 3080 VRAM OCs in the 3DMark top rankings.



dr.Rafi said:


> try to disable half of the 3900x cores in bios and make it runon 6 cores only it will boost higher and get you higher scores


Yes, thanks, did that already. It gives about 100MHz more, but not enough for 20000. I haven't tried whether maximizing the best CCX and downclocking the others helps. But I think I will just wait for Zen 3, because that was actually the reason I got the 3900X: just so that I can play around with the Ryzen platform before Zen 3. My [email protected] gives better scores and can also run @5.2, but I don't have any special RAM for it, just 3200CL15. It didn't like the 2x16GB b-die kit I got for the Zen build, which works great there.
In TS Extreme the card does better due to less CPU dependence:
TS Extreme 10034 GPU


----------



## kaydubbed

Here is the issue I'm having with the MSI Ventus. It doesn't matter what BIOS I use [here the XC3 EVGA.RTX3080.10240.200902_1.rom], I still can't get above 320W, and if I change the power limit through the BIOS, it just sets the 107% power limit to 320W! Has anyone seen this before, or have a suggestion? I am thinking about cutting my losses and returning the card to MC.


----------



## dr.Rafi

asdkj1740 said:


> do you have the link to that gigabyte rep reply?
> 
> and from what i remember aorus XTREME is $899, the most expensive one currently among 3080 avaliable.
> for 16+4 vrm design, aorus master priced at $849 is crazy.
> colorful advanced oc , asus tuf etc, they are priced well below 849 and having 16+4.


The Gigabyte Aorus Master has a beefy cooler, a lot of RGB, dual BIOS, and an LCD screen.


----------



## asdkj1740

dr.Rafi said:


> The Gigabyte Aorus Master has a beefy cooler, a lot of RGB, dual BIOS, and an LCD screen.


It's almost 4 slots, and the cooling performance is nothing special.
I would rather have some fan clips or a mounting frame so standard 120mm x 25mm fans could be mounted on top of the heatsink.

Dual BIOS is nothing special at $849 at all.

LCD screen: yeah, I was hoping for lower pricing on the Master for not having that LCD screen.

How about the input and output filtering cap choice on the Master? Not great at all.

BTW, with only 2x 8-pin I highly doubt Gigabyte dares to officially release a 450W BIOS for the Master, and I guess the Xtreme has the same VRM as the Master.


----------



## VPII

I'd like to ask a question. My Palit was sold today and I can get an Eagle OC next week. Would flashing the BIOS with the Gaming OC one help for power, I mean from 340 to 370W?

Sent from my SM-G960F using Tapatalk


----------



## spajdr

VPII: I had some success; the highest power draw on my EAGLE OC with the flashed Gaming OC BIOS was 362W.


----------



## TK421

kaydubbed said:


> Anyone have a link for this?





ncck said:


> Anyone here an owner of a 3080 and also have geforce experience installed? I don't want to install the software - was just wondering if someone could activate my legions key and send it to me. And yes I do own a 3080, here's a screencap for proof. Or is it not possible to do? If it is I'd really appreciate it, also if this is against the rules then apologies and I'll edit/remove the post
> 
> Edit: I also tried to activate on nvidia website but it requires GFE to be installed
> 












https://github.com/Sycnex/Windows10Debloater

https://github.com/Moyster/BaiGfe


----------



## gemini002

djriful said:


> Running on fan 3080 FE... https://www.3dmark.com/pr/394068
> https://www.3dmark.com/spy/14525238
> https://www.3dmark.com/spy/14525124
> https://www.3dmark.com/nr/338558
> https://www.3dmark.com/fs/23734207
> https://www.3dmark.com/fs/23734125


Yeah, me too. I get 2145MHz on air: https://www.3dmark.com/3dm/52046910?


----------



## SoldierRBT

BluePaint said:


> In Nvidia control panel, besides the standard stuff (power management max, texture filtering high performance, gsync off, ...), setting pre-rendered frames to 4 can actually give some more points in some 3dmark benches. In Windows, I make sure that there is nothing running in background by looking at taskmanager.
> Otherwise, with your card, if u want to climb the ladder some more, try to give your PC access to some fresh morning air since temps are probably the biggest limit with your card. Seems to be a really good chip if u haven't even optimized the curve yet and just added core. +1200 is also a very good VRAM overclock. it could be that the optimum is slightly below (each TS run always has a slightly different result even with same settings), just by judging from other 3080 VRAM OCs in 3dmark top rankings.


Thank you. I tweaked the Nvidia control panel settings and optimized Windows and got a few extra points. Got 3rd place on the leaderboard with air cooling.


https://www.3dmark.com/spy/14747556


Graphics Score 20 311

I'm still using +185/+1200. +190 seems to crash all the time, maybe it needs cooler temps.

EDIT: Got Port Royal and managed to get +205/+1200.


https://www.3dmark.com/pr/429551


Graphics Score 13164


----------



## rankftw

I've just got a Palit Gaming Pro. What bios can be flashed to this to increase the power limit?


----------



## spajdr

rankftw said:


> I've just got a Palit Gaming Pro. What bios can be flashed to this to increase the power limit?


Try Gigabyte Gaming OC bios


----------



## gemini002

Here are my results with the MSI Trio flashed with the Strix BIOS. First off, the Trio is a good card on its own; the Strix BIOS takes it to another level.


----------



## ssgwright

SoldierRBT said:


> Thank you. I tweaked Nvidia control settings and optimized Windows and got a few extra points. Got 3rd place in the leaderboard with air cooling.
> 
> 
> https://www.3dmark.com/spy/14747556
> 
> 
> Graphics Score 20 311
> 
> I'm still using +185/+1200. +190 seems to crash all the time, maybe it needs cooler temps.
> 
> EDIT: Got Port Royal and managed to get +205/+1200.
> 
> 
> https://www.3dmark.com/pr/429551
> 
> 
> Graphics Score 13164


that pr link isn't working but I saw you on leaderboard... nice score!


----------



## SoldierRBT

ssgwright said:


> that pr link isn't working but I saw you on leaderboard... nice score!


Thanks. Sorry about that. When I get higher scores, I delete the previous runs. Here's my latest score:


https://www.3dmark.com/pr/430629


Graphics Score 13 205

I made a video of my settings. The card can maintain 2205MHz on air on an open bench (21C ambient), max temp 61C. Here's the video (the score is low at the end because I was recording; I got 13205 with these settings):





The highest voltage I can use in Port Royal before hitting power limits is 1.056v (447W). I locked that voltage and added +225 on the core +1200 on the memory. Been tweaking 1.043v to lower temps and it could hold 2190-2205MHz no problem but I need to test more.


----------



## Soulpatch

I'm waiting for the EVGA 3080 as well. Picked up an EK waterblock already. Pre-ordered to fit the evga and it looks fantastic. If you are going to go to the evga, check out the ek waterblocks. I'm still running a 1070 with an ek block on it and division 2 runs great. Can't wait to get the upgraded card. Been waiting quite a while for it.


----------



## BluePaint

SoldierRBT said:


> Thanks. Sorry about that. When I get higher scores, I delete the previous runs. Here's my latest score:
> 
> 
> https://www.3dmark.com/pr/430629
> 
> 
> Graphics Score 13 205
> 
> The highest voltage I can use in Port Royal before hitting power limits is 1.056v (447W). I locked that voltage and added +225 on the core +1200 on the memory. Been tweaking 1.043v to lower temps and it could hold 2190-2205MHz no problem but I need to test more.


Congrats for your efforts! You are actually No. 1 on TS and PR on air as far as I can tell.


----------



## ssgwright

Pretty impressive on air to run 2200MHz with an average temp of 58C...


----------



## doom26464

How are you guys cooling these things on air at 450W?

My MSI Gaming X Trio at 360W does run cool, but once I throw some encoding at it, temps go up 7-8C. No way I could see doing 450W on top of that while encoding unless it was under water.

Still tempted to flash a Strix BIOS onto it anyway just for benchmarking, but that's all it would be good for; I can't daily-drive on it.


----------



## specopsFI

Hi all,

I've been gathering some data on the Palit GameRock models. They look like technically solid cards. To the best of my knowledge, both the OC and non-OC models have 21 VRM phases (18+3, 50 amps each) and both have a max power limit of 400W. The stock power limit differs: 340W for the non-OC and 370W for the OC. MSRP isn't quite clear, but in Europe the non-OC model has been one of the cheapest 3x 8-pin models; it has been available in the UK for £749 and at least in Finland for €849, both very comparable to most entry-level 3080s at the moment. If the looks don't scare you away, these might be good candidates for OC. The cooler is capable and build quality pretty solid; the VRM and memory especially keep very cool.
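Those phase counts leave a lot of theoretical headroom relative to the 400W limit. A back-of-the-envelope sketch (the ~1.0V load voltage is an assumed round number for illustration, not a measured figure):

```python
# Rough headroom estimate for the GameRock figures quoted above:
# 18 GPU phases + 3 memory phases, 50 A per power stage.

def vrm_capacity_watts(phases: int, amps_per_phase: float, vcore: float) -> float:
    """Theoretical combined power-stage capacity at a given core voltage."""
    return phases * amps_per_phase * vcore

gpu_capacity = vrm_capacity_watts(18, 50, 1.0)  # 900 W theoretical
print(f"GPU VRM capacity: {gpu_capacity:.0f} W vs 400 W BIOS power limit")
```

Even allowing for derating and transient spikes, the GPU power stages sit well below their combined rating at the 400W cap.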


----------



## zhrooms

asdkj1740 said:


> i forgot where i saw the msrp list of asus models. there is a model called rog 3080 t11g, where t means TOP, this model is said to be 899.
> 
> i am talking about these input & output filtering caps.
> this seems to be the base pcb design for gaming and eagle pcb.
> gigabyte replaces those sp caps on the front of the gaming and eagle pcb and cut them off on back side, while still keep that strange pcie connector on them.


The *Top* model is just an EEC registration; it does not mean they will ever see the light of day, so pretend you never saw it. Also, there have absolutely not been any pricing reveals ($899) of a card that doesn't exist. MSRP of the Strix is $849 and the Strix OC is $899, same as the AORUS Master at $849 and AORUS Xtreme at $899; both feature extra HDMI connectors and a higher boost over the FTW3, as an example of why they cost more. The Strix also features a more impressive PCB & VRM (we don't know the specifics of the AORUS Xtreme yet, so can't comment on that). Can't comment on cooler performance either, but the Strix looks incredible; the regular model at $849 is clearly a better choice than the FTW3 Ultra at $809 ($40 more).

The power stage capacitors do not matter for performance or overclocking, as far as I know, since *even the cheapest ones are overkill*. I've never seen any example of it being demonstrated on Turing or Ampere; it's very difficult to test, so very likely never will be. At the end of the day, what really matters is the artificial *power limit*.

The best example I can give is that the cheapest 2080 Ti, with the NVIDIA reference PCB 13+3 VRM, can *safely* run with *zero cooling* (literally *no* heatsink or airflow) on the VRAM or VRM; you just need to cool the GPU die. And with a water block to keep the VRM temps in check when overclocking, the VRM has *no issues doing 700W*. These cards are extremely over-engineered; running a 2080 Ti/3080 at its 250W/320W TDP is an absolute joke, they can do twice that on a decent air cooler, *safely*.


----------



## VULC

Can anyone confirm if the Strix OC ROM is compatible with the Colorful Vulcan OC?


----------



## gemini002

VULC said:


> Can anyone confirm if the Strix OC ROM is compatible with the Colorful Vulcan OC?


If it's not 3x 8-pin, nope.


----------



## VULC

It's 3x 8-pin. They didn't even list one of the best cards in the OP.


----------



## Nizzen

VULC said:


> It's 3 X 8 pin. They didn't even list one of the best cards in the OP.


Are the Colorful 3080/3090 cards China-only products?


----------






## VULC

Nizzen said:


> Is Colorful 3080/3090 China only products?


They ship globally, mostly across Asia. I'm in Australia and they have exclusive retailers here. Second biggest AIB after Palit/Gainward.


----------



## dr.Rafi

zhrooms said:


> The *Top* model is just an EEC registration, does not mean they will ever see the light of day, pretend you never saw it. Also, there has absolutely not been any pricing reveals ($899) of a card that doesn't exist. MSRP of the Strix is $849 and Strix OC is $899, same as AORUS Master $849 and AORUS Xtreme $899, both feature extra HDMI connectors, and higher boost, over the FTW3 as an example of why they cost more. The Strix also features a more impressive PCB & VRM (we don't know the specifics of AORUS Xtreme yet so can't comment on that). Can't comment about cooler performance either, but the Strix looks incredible, regular model for $849 is clearly a better choice than FTW3 Ultra at $809 ($50 more).
> 
> The power stage capacitors does not matter for performance or overclocking, as far as I know, since *even the cheapest ones are overkill*, never seen any examples of it being demonstrated on turing or ampere, very difficult to test, so very likely never will. At the end of the day, what really matters is the artificial *power limit*.
> 
> Like, the best example I can give is that the cheapest 2080 Ti, NVIDIA Reference PCB 13+3 VRM, can *safely* run with *zero cooling* (literally *no* heatsink or airflow) on VRAM or VRM, just need to cool the GPU die. And with a water block to keep the VRM temps in check when overclocking, the VRM has *no issues doing 700W. *These cards are extremely over-engineered, running them 2080 Ti/3080 at 250W/320W TDP is an absolute joke, they can do twice that on a decent air cooler, *safely*.


It's not a joke; they're just trying to keep most gamers' electricity bills down. Using 600W for the graphics card alone, a whole day of gaming will bloat your bill fast. And not everyone who uses these cards has knowledge of these things; all they do is plug and play. Don't take it personally, I'm like you, not a plug-and-play guy.


----------



## bmgjet

dr.Rafi said:


> Is not joke its only they targeting to keep most gamers electricity bills down using 600 watt for graphic card only for whole day gaming will bloat your bills so fast. and not every one and most who use these card have no knowledge in these things all they do plug and play, dont take it personally iam like you not a plug and play guy.


If you can afford a 3080, you can afford to pay for power lol.
Over here it's 33 cents per kWh; I can't imagine many other countries charging this much.
So that's 19.8 cents (0.11 USD) per hour of gaming at 600W. Say you game all day for 8 hours: that's $1.58 ($1.05 USD).
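The arithmetic above can be sketched in a few lines (using the quoted NZD rate and wattage):

```python
# Electricity cost of gaming: 600 W draw at NZ$0.33 per kWh.

def gaming_cost(watts: float, rate_per_kwh: float, hours: float) -> float:
    """Cost of running a load for the given hours at the given kWh rate."""
    return watts / 1000 * rate_per_kwh * hours

per_hour = gaming_cost(600, 0.33, 1)  # 0.198 -> "19.8 cents"
per_day = gaming_cost(600, 0.33, 8)   # 1.584 -> "$1.58"
print(f"{per_hour * 100:.1f} cents/hour, ${per_day:.2f}/day")
```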


----------



## GTANY

zhrooms said:


> The *Top* model is just an EEC registration, does not mean they will ever see the light of day, pretend you never saw it. Also, there has absolutely not been any pricing reveals ($899) of a card that doesn't exist. MSRP of the Strix is $849 and Strix OC is $899, same as AORUS Master $849 and AORUS Xtreme $899, both feature extra HDMI connectors, and higher boost, over the FTW3 as an example of why they cost more. The Strix also features a more impressive PCB & VRM (we don't know the specifics of AORUS Xtreme yet so can't comment on that). Can't comment about cooler performance either, but the Strix looks incredible, regular model for $849 is clearly a better choice than FTW3 Ultra at $809 ($50 more).
> 
> The power stage capacitors does not matter for performance or overclocking, as far as I know, since *even the cheapest ones are overkill*, never seen any examples of it being demonstrated on turing or ampere, very difficult to test, so very likely never will. At the end of the day, what really matters is the artificial *power limit*.
> 
> Like, the best example I can give is that the cheapest 2080 Ti, NVIDIA Reference PCB 13+3 VRM, can *safely* run with *zero cooling* (literally *no* heatsink or airflow) on VRAM or VRM, just need to cool the GPU die. And with a water block to keep the VRM temps in check when overclocking, the VRM has *no issues doing 700W. *These cards are extremely over-engineered, running them 2080 Ti/3080 at 250W/320W TDP is an absolute joke, they can do twice that on a decent air cooler, *safely*.


I may contemplate buying the ASUS RTX 3090 TUF, shunt- and voltage-modding it (with an EVC2 voltage controller) and watercooling the GPU. The left VRM and RAM would be cooled by the default heatsinks, which are excellent; the right VRMs would be covered by a 5mm-thick copper plate. RAM, left and right VRM, and backplate would be cooled by 120mm fans. The GPU would be cooled by a modded CPU waterblock and 2 big Watercool MO-RA3 Pro 420 radiators.

Shunts: 2x 8-pin + PCIe 5 mOhm shunts to double the total power
Voltage: +100 mV max

I am a little bit worried about long-term reliability if I use the graphics card shunt- and voltage-modded for games for 2 years: the power stage has 16 GPU phases, each 50 A max, which is 800 W max at 1 V. A 3090 shunt- and voltage-modded may go beyond this limit, which is risky, even with the GPU watercooled. And extracting 150 W max from the motherboard via the PCI Express connector does not make me feel very confident. Your opinion?
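For anyone curious how the shunt numbers work out: soldering a resistor on top of a stock shunt puts the two in parallel, so the monitoring circuit under-reads the current. A sketch, assuming the common 5 mOhm stock shunt value (an assumption; check your specific card):

```python
# Shunt mod arithmetic: a 5 mOhm resistor stacked on a 5 mOhm stock shunt
# halves the effective resistance, so the controller reads half the real
# current and the real power limit effectively doubles.

def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

stock, added = 0.005, 0.005
effective = parallel(stock, added)  # 2.5 mOhm
scale = stock / effective           # real power = reported power * scale
print(f"effective shunt {effective * 1000:.1f} mOhm -> real draw = {scale:.0f}x reported")
```

This is also why a shunt-modded card's software power readings become meaningless: everything downstream of the shunt sees half the true current.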


----------



## Nizzen

VULC said:


> They ship globally mostly in all Asia. I'm in Australia and they have exclusive retailers here. Second biggest AIB after Gainward Palit.


Haven't seen too many Colorful products here in Europe. We have one product of Colorful 3080/3090 here, and it's Barrow waterblocks.


----------



## William Clement

Got my FTW3 finally, but whatever I do on air I'm just stuck between 2010-2070, with an occasional peak to 2085 for a little bit. When I try to go higher it just crashes to the desktop.
Guess I have an average chip? Or is it one of the worst?
I see people here at 2.1+, but is everyone getting that? Running the beta BIOS of course, so the power limit isn't really reached.

My mem goes +1250, so that seems fine.

I will put it under water eventually, so hopefully it pushes a little bit higher than 2.1.
Not that it seems to matter much for FPS in games anyway. Just a thing, you know. WoW seems fine running at 2070.


----------



## Edge0fsanity

William Clement said:


> Got my ftw3 finnaly but whatever I do on air i am just stuck between 2010-2070 with sometimes a peak to 2085 for a little bit. When I try to go higher it just crashes to the desktop.
> Guess I have a average chip? Or is it one of the worst?
> I see people here with 2.1+ but is everyone getting that? Running the beta bios offcourse powerlimit isn't really reached.
> 
> My mem goes +1250 so that seems fine.
> 
> I will put it under water eventuuely so hopes it pushes a little bit higher then hope 2.1.
> Not that it seems to mater much fps anyway in games. Just a thing you know  wow seems fine running at 2070


Your card sounds just like my FTW3. Peaks at 2085 on air but realistically runs closer to 2010 due to power throttling with a simple +70 OC on it. I suspect these results are much closer to the average for these cards; all these people above 2100MHz on air have golden chips and are the exception. I've played around with the V/F curve quite a bit and found 1.025v @ 2070MHz to be the sweet spot for games and the Port Royal benchmark. It never power throttles with the 450W BIOS and runs at 2040MHz once fully heat-soaked, around 66-70C.

Also, for memory you're likely not gaining anything from running it at +1250. I've found diminishing returns kicking in after +700. This memory has error checking, so you need to repeatedly run Port Royal at different frequencies to find where the benefit is lost.
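The sweep described above can be sketched as follows. GDDR6X error correction means too-high offsets silently lose performance instead of crashing, so you benchmark at each offset and keep the best scorer. `run_port_royal` here is a hypothetical stand-in, not a real API; in practice you run the actual benchmark and record the graphics score by hand:

```python
# Find the memory offset where benchmark score peaks, rather than the
# highest offset that doesn't crash.

def best_memory_offset(offsets, run_port_royal):
    """Return the offset with the highest benchmark score."""
    scores = {off: run_port_royal(off) for off in offsets}
    return max(scores, key=scores.get)

# Stub modeling scores that rise with offset, then fall once error
# correction starts retrying transfers (illustrative numbers only).
def fake_port_royal(offset):
    return 13000 + offset * 0.5 - max(0, offset - 700) * 0.9

print(best_memory_offset(range(0, 1300, 100), fake_port_royal))
```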


----------



## acoustic

Flash the 450watt BIOS to the card if you haven't already.

I agree that the core numbers some people report seem impossible. Even if my card is apparently a terrible bin, it still seems impossible. There's no way some of them are game stable under sustained loads; as soon as the card hits over 70-75c, the stability for core clocks drops pretty hard.


----------



## GTANY

William Clement said:


> Got my ftw3 finnaly but whatever I do on air i am just stuck between 2010-2070 with sometimes a peak to 2085 for a little bit. When I try to go higher it just crashes to the desktop.
> Guess I have a average chip? Or is it one of the worst?
> I see people here with 2.1+ but is everyone getting that? Running the beta bios offcourse powerlimit isn't really reached.
> 
> My mem goes +1250 so that seems fine.
> 
> I will put it under water eventuuely so hopes it pushes a little bit higher then hope 2.1.
> Not that it seems to mater much fps anyway in games. Just a thing you know  wow seems fine running at 2070


What is your resolution? What benchmarks/games? At 4K, such a GPU frequency is fine. Moreover, many report 2100+ MHz, but it is only a peak frequency.


----------



## William Clement

acoustic said:


> Flash the 450watt BIOS to the card if you haven't already.
> I agree that the core numbers some people report seem impossible. Even if my card is apparently a terrible bin, it still seems impossible. There's no way some of them are game stable under sustained loads; as soon as the card hits over 70-75c, the stability for core clocks drops pretty hard.


Yeah, I did. It's a bit more stable at the "higher" clocks since the original only has 400W.



GTANY said:


> What is your resolution ? What benchmarks/games ? In 4k, such GPU frequency is fine. Moreover, many tell +2100 Mhz but it is only a peak frequency.


I play at 3440x1440 atm. My benches are from whatever 3DMark does on its default runs.
Since the EU doesn't have the beta firmware fix for my C9 yet, I'm not testing 4K yet. I guess the clocks will go down a bit more at that res.

I just want to know if I have a dud GPU or if it's OK-ish. I think it is, but I'm not sure; that's why I'm kind of curious what people here are getting/playing their games at.




Edge0fsanity said:


> Your card sounds just like my ftw3. Peaks at 2085 on air but realistically runs closer to 2010 due to power throttling with a simple +70 OC on it. I suspect these results are much closer to the average for these cards, all these people above 2100mhz on air have golden chips and are the exception. I've played around with the VF curve quite a bit and found 1.025v @ 2070mhz to be the sweet spot for games and port royal benchmark. Never power throttles with the 450w bios and runs at 2040mhz once fully heatsoaked around 66-70C.
> 
> Also for memory you're likely not gaining anything from running it at +1250. I've found diminishing returns kicking in after +700. This memory has error checking so you need to repeatedly run port royal at different frequencies to find where the benefit is lost.


Hmm, I will try a custom voltage curve a bit.

And for the mem, I know that when I run Port Royal above +1250 I get less, and if I do fewer MHz on it I also get less.


----------



## daveleebond

William Clement said:


> Yeah I did it's a bit more stable on the "higher" clocks sins the original only has 400w.
> 
> 
> I play 3440x1440p atm. My benches are from whatever 3d mark does on there default runs.
> Sins eu don't have the beta fw fix for my c9 yet I am not testing 4k yet. I guess the clocks will go down a bit more on that res.
> 
> I just want to know if i have a dud gpu or it's okish. I think it is but not sure that's why I am kinda querious what people here are getting/playing there games at
> 
> 
> 
> Hmm ii will try a custom voltage curve a bit.
> 
> And the mem I know when I do port royal above 1250 then I get less. And if I do less mhz on it I also get less.


I have a C9 and updated it in engineer mode weeks ago. Really easy to do, and playing Halo MCC [email protected] is glorious.


----------



## William Clement

daveleebond said:


> I have a C9 and updated in engineer mode weeks ago. Really easy to do and playing halo mcc [email protected] is glorious


Yeah, I can imagine, but I live in the EU, and LG support here is like: update? What update? We don't have an update yet.

Anyway, I think my card is an average clocker on the GPU if I look at this list from OC3D, so I guess I don't have that messed up of a card.
I'll wait for the EKWB block to release and hope I get a bit more out of it. And who knows, maybe NVIDIA brings out something else in the 90 days I have to step up to something else, or brings the 3090 price down so much that maybe that's a nice option. For now I'll just live with my current clocks.


----------



## SoldierRBT

Just uploaded a new video playing Battlefield V. RTX 3080 holds 2175MHz on air. Fans 100% speed


----------



## William Clement

SoldierRBT said:


> Just uploaded a new video playing Battlefield V. RTX 3080 holds 2175MHz on air. Fans 100% speed


Dude, that's insane, you got a golden sample it seems!


----------



## senna89

*Anyone moved from a custom RTX 2080 Ti or 1080 Ti to a 3080 FE?*
Any feedback on CPU temp increase? Especially with an air cooler.


----------



## Nizzen

SoldierRBT said:


> Just uploaded a new video playing Battlefield V. RTX 3080 holds 2175MHz on air. Fans 100% speed


Looks nice


----------



## spajdr

SoldierRBT said:


> Just uploaded a new video playing Battlefield V. RTX 3080 holds 2175MHz on air. Fans 100% speed


How is clock holding up in Quake II RTX?


----------



## SoldierRBT

Nizzen said:


> Looks nice


Thanks. Here's another video holding 2220MHz at 1.081v voltage locked. Stock cooler with good airflow.





Hopefully it can do better with a waterblock.



spajdr said:


> How is clock holding up in Quake II RTX?


Haven't tried it yet but it holds 2160MHz at 1.012v on Time Spy (hitting power limits) and 2205MHz 1.056v in Port Royal.


----------



## Nizzen

SoldierRBT said:


> Thanks. Here's another video holding 2220MHz at 1.081v voltage locked. Stock cooler with good airflow.
> 
> 
> 
> 
> 
> Hopefully it can do better with a waterblock.
> 
> 
> 
> Haven't tried it yet but it holds 2160MHz at 1.012v on Time Spy (hitting power limits) and 2205MHz 1.056v in Port Royal.


PS: Turn off motion blur. It makes life better


----------



## acoustic

Is that shunt modded?


----------



## SoldierRBT

Nope, just with the 450W beta BIOS and 100% fans speed.


----------



## acoustic

Ah. Yeah, low power draw means the card can open up with the higher voltages. Definitely helping a ton with the temps too. Cool!


----------



## Vapochilled

SoldierRBT said:


> Thanks. Here's another video holding 2220MHz at 1.081v voltage locked. Stock cooler with good airflow.
> 
> 
> 
> 
> 
> Hopefully it can do better with a waterblock.
> 
> 
> 
> Haven't tried it yet but it holds 2160MHz at 1.012v on Time Spy (hitting power limits) and 2205MHz 1.056v in Port Royal.


Bench again at 4k and let me know the new results


----------



## SoldierRBT

The closest resolution to 4K I can select through DSR is 4865x2036. It still holds 2190-2205MHz at 1.043-1.056v load voltage depending on the map.







EDIT:
Ran Quake 2 RTX at 440W+ and it holds 2160MHz with just 1.012v (voltage locked to avoid the power limit). Also removed the memory OC to get extra wattage for the core OC. I hope EVGA releases a 500W BIOS for the 3080 too.
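As a side note on that DSR figure: 4865x2036 from a 3440x1440 panel is the 2.00x DSR factor, which doubles the pixel count, scaling each axis by sqrt(2). A quick check:

```python
# DSR scales total pixel count by the factor, so each axis scales by
# sqrt(factor). 2.00x on 3440x1440 lands at 4865x2036.
import math

def dsr_resolution(w: int, h: int, factor: float) -> tuple[int, int]:
    """Rendered resolution for a DSR factor applied to a base resolution."""
    scale = math.sqrt(factor)
    return round(w * scale), round(h * scale)

print(dsr_resolution(3440, 1440, 2.0))
```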


----------



## spajdr

Dear god, what a golden chip if it holds up this well even in Quake II RTX.


----------



## dr.Rafi

SoldierRBT said:


> Thanks. Here's another video holding 2220MHz at 1.081v voltage locked. Stock cooler with good airflow.
> 
> 
> 
> 
> 
> Hopefully it can do better with a waterblock.
> 
> 
> 
> Haven't tried it yet but it holds 2160MHz at 1.012v on Time Spy (hitting power limits) and 2205MHz 1.056v in Port Royal.





William Clement said:


> Yeah I can imagine but i live in the eu and lg support here is like update? What update? we no have a update yet?
> 
> Anyway I think my card is a average clocker on the gpu if i see this list from oc3d so guess I don't have that messed up of a card.
> Ill wait for ekwb block to release and hope I get a bit more out of it. And who knows maybe nvidia brings out something else in these 90 days I have to stepup to something else or bring down the 3090 price so much that maybe that is a nice option  for now I just live with my current clocks.
> 


Any card with 3x 8-pin can hold a 2100 or even 2200+ overclock easily if cooled under 40 or 50C, but with any 2x 8-pin it's impossible because it needs to draw 400W or more. The only way around it is a shunt mod to pull more power. It has nothing to do with silicon quality; most 3080 NVIDIA chips are within plus/minus 50MHz of each other in achievable clocks. The whole story lies in the power stages and the capacitors behind the NVIDIA chip.
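For reference, the connector math behind the 2x vs 3x 8-pin argument, using the nominal PCIe spec budgets (75W from the slot, 150W per 8-pin; real cards can pull more per connector, but vendor BIOS limits are typically set around these figures):

```python
# Nominal spec power budget for a card with N 8-pin connectors.

PCIE_SLOT_W = 75    # PCIe slot budget per spec
EIGHT_PIN_W = 150   # 8-pin PCIe aux connector budget per spec

def spec_budget(n_eight_pin: int) -> int:
    """Total nominal power budget: slot plus aux connectors."""
    return PCIE_SLOT_W + n_eight_pin * EIGHT_PIN_W

print(f"2x 8-pin: {spec_budget(2)} W, 3x 8-pin: {spec_budget(3)} W")
```

So a 2x 8-pin card sits at a nominal 375W ceiling, which is why a 450W target on such a card usually means a shunt mod rather than just a BIOS flash.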


----------



## dr.Rafi

acoustic said:


> Is that shunt modded?


Any card with 3x 8-pin can hold that overclock easily if cooled under 40 or 50C, but with any 2x 8-pin it's impossible because it needs to draw 400W or more. The only way around it is a shunt mod to pull more power. It has nothing to do with silicon quality; most 3080 chips are within plus/minus 50MHz of each other. The whole story lies in the power stages and the capacitors behind the NVIDIA chip.


----------



## William Clement

dr.Rafi said:


> Any card with 3 x 8 pin can hold 2100 or even 2200 + overclock easy if cooled under 40 or 50 but with any 2 x 8pin is imppossible because it need to draw 400 or more power, the only way arounded is shunt mod to pull more power , it is nothing to do with silicon quality most of 3080 nividia chip quality is minus plus 50 mhz different acheivable , the whole story lay on power stages and the capacitor behind the nvidia chip.


Let's hope so. From what I see in GPU-Z, I constantly run into these perfcap (VRel, VOp) limits; I don't see Pwr or Tmp, so maybe my GPU is just voltage starved? If so, I'm screwed, I guess?


----------



## dr.Rafi

William Clement said:


> Let hope so from what I see in gpu-z I constantly run at these percap (VRel, VOp) limits I don't see pwr or tmp so maybe my gpu is just voltage starved? If so i am screwed i guess?


I shunt-modded my Ventus with 0.005 ohm shunts on the four near the power connectors and the PCIe one, flashed with the old 400W EVGA FTW3 Ultra BIOS, giving a total graphics draw of 450 to 500W, because the 450W one from EVGA goes crazy and reaches 650W draw for the same performance. I played the Quake 2 RTX demo and adjusted the curve to 1012mV; I am getting a fixed 2130MHz, stable in play. Waiting for 22uF x50 MLCC caps from the USA; mine has 5 SP-CAPs and 10 MLCCs from the factory. Going to replace 1 SP-CAP in the middle with 10 MLCCs and see how it goes.


----------



## dr.Rafi

William Clement said:


> Let hope so from what I see in gpu-z I constantly run at these percap (VRel, VOp) limits I don't see pwr or tmp so maybe my gpu is just voltage starved? If so i am screwed i guess?


The GPU is watercooled with a CPU cooler, and a 120mm fan blows from the back to the front of the card. The RAM and power stages run so cool they're not even warm to the touch. I kept the backplate with the thermal pads in place, and all the RAM chips have heatsinks on them; that photo was taken before I applied them to the last 2 chips.


https://www.3dmark.com/spy/14749694


I have this one with the best graphics score and clocks, but the CPU throttled and dropped the final score: https://www.3dmark.com/spy/14797065


----------



## GTANY

dr.Rafi said:


> the gpu is watercooled with cpu cooler , and 120 fan blowing from the back to the front of the card the ram and power stages are running so cool even not warm to touch , i kept the back plate with the thermal pads in place and all the ram chips have heatsink on them that photo was before i applied to the last 2 chips .
> 
> 
> https://www.3dmark.com/spy/14749694
> 
> 


I see no radiator on 1 RAM chip. What is its temperature with an IR thermometer ?


----------



## William Clement

dr.Rafi said:


> the gpu is watercooled with cpu cooler , and 120 fan blowing from the back to the front of the card the ram and power stages are running so cool even not warm to touch , i kept the back plate with the thermal pads in place and all the ram chips have heatsink on them that photo was before i applied to the last 2 chips .
> 
> 
> https://www.3dmark.com/spy/14749694


Thanks for the info!
What were you able to run before you went to watercooling? I am a bit curious what kind of uplift I should be expecting.

And does anyone know if these VRel/VOp messages I get are normal? I would expect an air cooler to hit Tmp instead, but maybe I am just seeing ghosts. It's been a while since I OC'ed GPUs.



dr.Rafi said:


> You can go back five pages in this club to check and follow my progress.
> 
> https://www.3dmark.com/spy/14530026 — that was my best on air, though I'm not sure; I think it was also on water, but definitely before the shunts. I always go for watercooling the GPU, especially since I have everything I need; my card can push higher, stay cooler, and make less noise.


Seems like water to me, 31 degrees.
The clock on that one seems to be at stock.
Anyway, I guess I'll just have to work with what I've got and hope for a 60MHz uplift or so on water; that should be OK for me. Not that it matters much in games anyway.


----------



## dr.Rafi

GTANY said:


> I see no radiator on 1 RAM chip. What is its temperature with an IR thermometer ?


Please read the text carefully.


----------



## asdkj1740

dr.Rafi said:


> The GPU is watercooled with a CPU cooler, and a 120mm fan blows from the back to the front of the card. The RAM and power stages run so cool they're not even warm to the touch. I kept the backplate with the thermal pads in place, and all the RAM chips have heatsinks on them; that photo was taken before I applied them to the last two chips.
> 
> 
> https://www.3dmark.com/spy/14749694
> 
> 
> I have this one with the best graphics score and clocks, but the CPU throttled and dropped the final score: https://www.3dmark.com/spy/14797065
> View attachment 2463442
> 
> View attachment 2463450


Hey, does your Ventus cooler have direct-touch heatpipes?
I saw a picture recently showing the Ventus has now changed to a copper base.


----------



## dr.Rafi

William Clement said:


> Thanks for the info!
> What were you able to run before you went to watercooling? I'm a bit curious what kind of uplift I should be expecting.
> 
> And does anyone know if these VRel/VOp messages I get are normal? I would expect an air cooler to hit Tmp instead, but maybe I'm just seeing ghosts. It's been a while since I OC'ed GPUs.


You can go back five pages in this club to check and follow my progress.


https://www.3dmark.com/spy/14530026 — that was my best on air, though I'm not sure; I think it was also on water, but definitely before the shunts. I always go for watercooling the GPU, especially since I have everything I need; my card can push higher, stay cooler, and make less noise.


----------



## sblantipodi

It's funny to see that Ubisoft says you need a 2080 to play Valhalla at 4K 30 FPS, yet you need an Xbox Series X to play it at 4K 60 FPS.


----------



## dr.Rafi

William Clement said:


> Thanks for the info!
> What were you able to run before you went to watercooling? I'm a bit curious what kind of uplift I should be expecting.
> 
> And does anyone know if these VRel/VOp messages I get are normal? I would expect an air cooler to hit Tmp instead, but maybe I'm just seeing ghosts. It's been a while since I OC'ed GPUs.
> 
> 
> Seems like water to me, 31 degrees.
> The clock on that one seems to be at stock.
> Anyway, I guess I'll just have to work with what I've got and hope for a 60MHz uplift or so on water; that should be OK for me. Not that it matters much in games anyway.


It was not stock clocks; the Gigabyte BIOS on my Ventus was reading the clocks wrong in 3DMark for some unknown reason, and as I said, I think that was on water but before the shunts. If you check, the average is too low; there's no way you can get that score at those clocks. In Time Spy, as I remember, it was boosting to around 1950 before the shunt mod on water, but dropping to as low as 1780 in certain scenes during the bench. Now it can do 2115 to 2130 most of the time.


----------



## dr.Rafi

> And does anyone know if these VRel/VOp messages I get are normal? I would expect an air cooler to hit Tmp instead, but maybe I'm just seeing ghosts. It's been a while since I OC'ed GPUs.


Not sure, but on mine, with the shunt mod and no heatsink or fan on the power stages, after 10 to 20 minutes of gameplay the card still throttled: the power chips are thermally protected, so they drop the power, show the Pwr limit flag, and the clocks start dropping. Once I put the heatsinks and fan on, it never dropped again.
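For anyone puzzled by the perfcap flags: GPU-Z's VRel/VOp indicate a voltage ceiling (reliability / max operating voltage), not a power or thermal cap. The driver also exposes throttle reasons as a bitmask via NVML; a minimal sketch of decoding such a mask is below. The bit values are as I recall them from NVIDIA's `nvml.h` headers — verify against the NVML docs before relying on them, and `decode_perfcap` is just a helper name I made up:

```python
# Throttle-reason bits as published in NVML (nvmlClocksThrottleReasons*).
# Treat these values as illustrative; check nvml.h for your driver version.
THROTTLE_REASONS = {
    0x01: "GpuIdle",
    0x02: "ApplicationsClocksSetting",
    0x04: "SwPowerCap (Pwr)",
    0x08: "HwSlowdown",
    0x10: "SyncBoost",
    0x20: "SwThermalSlowdown (Thrm)",
    0x40: "HwThermalSlowdown",
    0x80: "HwPowerBrakeSlowdown",
}

def decode_perfcap(mask):
    """Return the name of every throttle-reason bit set in the mask."""
    return [name for bit, name in THROTTLE_REASONS.items() if mask & bit]

# Example: a card limited by both power cap and software thermal slowdown
print(decode_perfcap(0x24))  # ['SwPowerCap (Pwr)', 'SwThermalSlowdown (Thrm)']
```

So a card showing only VRel/VOp is sitting at its voltage ceiling, which is normal for Ampere at stock limits; seeing Pwr or Thrm instead would point at the power budget or cooling.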


----------



## Edge0fsanity

Has anyone done a before/after comparison of overclocks on air vs. water with a full-cover block? Curious to see the real difference this makes without a shunt mod happening in the process.


----------



## DStealth

Edge0fsanity said:


> Has anyone done before/after review of overclocks on air vs water with a full cover block? Curious to see the real difference this makes without a shunt mod happening in the process.


No; probably 1-2, max 3, 13MHz straps of difference, temperature dependent.


----------



## DarknightOCR

Yeah, I'm curious too.


----------



## SoldierRBT

dr.Rafi said:


> Any card with 3x 8-pin can hold a 2100 or even 2200+ overclock easily if cooled under 40 or 50, but with any 2x 8-pin it's impossible because it needs to draw 400W or more. The only way around it is a shunt mod to pull more power. It has little to do with silicon quality; most 3080 NVIDIA chips are within plus or minus 50MHz of each other in achievable clocks. The whole story lies in the power stages and the capacitors behind the NVIDIA chip.


I don’t think that’s true. I think silicon quality is everything no matter how limited the GPU is in terms of power. Higher TDP only allows the video card to sustain a higher voltage point on heavy loads.

Let’s say you’re limited to 0.900v hitting power limits on a 4K test and the extra TDP from a different BIOS lets the video card boost to 0.950v. It’ll still be limited to what it can achieve at 0.950v.

My card that does 2160MHz 1.012v locked on a 440W+ test can do the same on a game that draws only 350-370W. It won’t do higher clocks because the wattage draw is lower.

My previous 2080 Ti maxed out at 2040MHz, and it was a 3x 8-pin card with a 400W BIOS; that was the best I got out of 4 cards.
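The point that extra TDP mainly buys a higher sustained voltage point follows from the usual first-order dynamic power model, P ≈ C·V²·f. A quick sketch of that scaling — the 320W baseline and the voltage/clock pairs are illustrative numbers, not measurements from any specific card:

```python
def scaled_power(p0, v0, f0, v1, f1):
    """First-order dynamic power estimate: P scales with V^2 * f."""
    return p0 * (v1 / v0) ** 2 * (f1 / f0)

# Going from 0.900V/1900MHz to 0.950V/2000MHz at an assumed 320W baseline:
p1 = scaled_power(320, 0.900, 1900, 0.950, 2000)
print(round(p1))  # 375 -> roughly 17% more power for ~5% more clock
```

That quadratic voltage term is why a modest bump in the sustainable voltage point eats so much of a higher power limit, and why the chip's achievable frequency at a given voltage still decides the outcome.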


----------



## William Clement

Just weird. I'm now playing around with Afterburner a bit.
I installed Afterburner and set a voltage curve, since that's easier in it: Ctrl+L to give it a flat curve one strap higher than my Precision settings would give in Heaven, at 1.056V.

And guess what: no perfcap voltage stuff in GPU-Z, just showing idle. WTH??

Screw this, I'm now getting 2070-2100 @ 1.081V. Guess I'll be able to reach my goal with a waterblock 🤣
Yep, found it. Setting a fixed voltage/clock seems to help me. When I use just the +MHz offset in Precision and Afterburner, it crashes way sooner. It has to be something with the voltage jumping around?
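What flattening the curve does, in effect, is cap the voltage/frequency table so the boost algorithm can never request a voltage above the chosen point. A toy model of that behavior — the curve values here are made up for illustration (real curves step in small MHz bins per voltage point), and `flatten_curve` is just a hypothetical helper name:

```python
def flatten_curve(curve, v_lock, f_lock):
    """Clamp a voltage->frequency curve at (v_lock, f_lock):
    every point at or above v_lock gets exactly f_lock, so the
    boost algorithm never requests a voltage above v_lock."""
    return {v: (f_lock if v >= v_lock else min(f, f_lock))
            for v, f in curve.items()}

# Hypothetical stock curve (V -> MHz), nothing like a real BIOS table
stock = {0.90: 1905, 0.95: 1950, 1.00: 2010, 1.05: 2070, 1.08: 2100}
locked = flatten_curve(stock, 1.05, 2070)
# locked[1.08] is now 2070 too, so the card sits at 1.05V/2070MHz under load
```

That also explains why a plain +MHz offset crashes sooner: the offset shifts the whole curve, so transient boosts into higher voltage/frequency points still happen, while a flattened curve pins the card to one known-stable operating point.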


----------



## dr.Rafi

SoldierRBT said:


> I don’t think that’s true. I think silicon quality is everything no matter how limited the GPU is in terms of power. Higher TDP only allows the video card to sustain a higher voltage point on heavy loads.
> 
> Let’s say you’re limited to 0.900v hitting power limits on a 4K test and the extra TDP from a different BIOS lets the video card boost to 0.950v. It’ll still be limited to what it can achieve at 0.950v.
> 
> My card that does 2160MHz 1.012v locked on a 440W+ test can do the same on a game that draws only 350-370W. It won’t do higher clocks because the wattage draw is lower.
> 
> My previous 2080 Ti maxed out at 2040MHz, and it was a 3x 8-pin card with a 400W BIOS; that was the best I got out of 4 cards.


I didn't say silicon quality doesn't matter; I even mentioned plus or minus 50MHz. But so far, reading every comment in this forum, studying every result from everyone who shared one, and watching dozens of YouTube videos about the 3080, I can see that the ASUS Strix and TUF with modded BIOS, and most EVGAs with the new BIOS, achieve the best results because they already have good power components and more MLCCs on the back of the card. And simply put, on air, when your power stages run cooler the GPU runs cooler as well, because they share the same heatsink; they also use a better cooling design.


----------



## PhoenixMDA

SoldierRBT said:


> I don’t think that’s true. I think silicon quality is everything no matter how limited the GPU is in terms of power. Higher TDP only allows the video card to sustain a higher voltage point on heavy loads.
> 
> Let’s say you’re limited to 0.900v hitting power limits on a 4K test and the extra TDP from a different BIOS let the video card boosts to 0.950v. It’ll still be limited to what it can achieve at 0.950v.
> 
> My card that does 2160MHz 1.012v locked on a 440W+ test can do the same on a game that draws only 350-370W. It won’t do higher clocks because the wattage draw is lower.
> 
> My previous 2080 Ti would maxed out at 2040MHz and it was a 3x 8pin card with 400W BIOS and that was the best I got out of 4 cards.


Nice card. At HWL we have tested some cards with ATITool at 1.081-1.1V for the highest boost; it's without heavy load.
The best cards reach up to 2340MHz and Mem +1300 to +1400; the bad ones around 2200MHz. Under heavy load a bad chip is nothing more than a heating plate ^^.
You have a good card; I think around 2300MHz max boost with ATITool.
This is from [email protected]; my own card finishes at 2295MHz and Mem +1300 to +1400. Perhaps someone can go better ^^. So you can easily test how good your chip is.


----------



## SoldierRBT

Nice. Can you teach me how to use ATI Tool to find the highest possible core clock? I think my card can only reach 1.081v. It could be a BIOS limitation


----------



## PhoenixMDA

You only need to raise the core voltage +100%. Look at the picture; there you can see it.
Then you can set the point at 1.1V.

The chips on the Strix OC, TUF, and TUF OC are not really binned; the best chip was on a TUF, mine is a TUF OC, and one guy with a Strix OC has a really bad chip ^^


----------



## William Clement

PhoenixMDA said:


> Nice card. At HWL we have tested some cards with ATITool at 1.081-1.1V for the highest boost; it's without heavy load.
> The best cards reach up to 2340MHz and Mem +1300 to +1400; the bad ones around 2200MHz. Under heavy load a bad chip is nothing more than a heating plate ^^.
> You have a good card; I think around 2300MHz max boost with ATITool.
> This is from [email protected]; my own card finishes at 2295MHz and Mem +1300 to +1400. Perhaps someone can go better ^^. So you can easily test how good your chip is.


Hmm, interesting, thanks for the tip.

2190 gives errors, sad face; 2175 seems OK. Bleh. Should I gamble on getting a better one, or just leave it be and gamble again next gen?
Are there more "bad" cards than good ones, I wonder? Say if one out of five is good and the rest bad, then it won't really be worth taking the gamble.


----------



## SoldierRBT

PhoenixMDA said:


> You only need to raise the core voltage +100%. Look at the picture; there you can see it.
> Then you can set the point at 1.1V.
> 
> The chips on the Strix OC, TUF, and TUF OC are not really binned; the best chip was on a TUF, mine is a TUF OC, and one guy with a Strix OC has a really bad chip ^^


Like this? Took the screenshot at 32C to match your photo.


----------



## PhoenixMDA

When testing, it's important that you don't push via +MHz on the curve; in that case you reach 2-3 steps less.
Only set the point at 1.1V and raise it up to xxxx MHz.
And yes, it seems that the range between chip qualities is more than 150MHz, and an FTW or Strix doesn't necessarily have a good chip.
@SoldierRBT
Very nice, your chip is one of the best; you do 45MHz more than I do. I can hold 2310MHz for a short time at 31 degrees. Your chip needs ca. 25mV less than mine under load at the same frequency.


----------



## dr.Rafi

PhoenixMDA said:


> Nice card. At HWL we have tested some cards with ATITool at 1.081-1.1V for the highest boost; it's without heavy load.
> The best cards reach up to 2340MHz and Mem +1300 to +1400; the bad ones around 2200MHz. Under heavy load a bad chip is nothing more than a heating plate ^^.
> You have a good card; I think around 2300MHz max boost with ATITool.
> This is from [email protected]; my own card finishes at 2295MHz and Mem +1300 to +1400. Perhaps someone can go better ^^. So you can easily test how good your chip is.
> View attachment 2463517


----------



## PhoenixMDA

If my chip is under 32 degrees I can hold 2310MHz; it starts at 2325 and quickly goes down to 2310, but that is the limit of my chip.









In 24/7 use, in a demanding game like Hunt: Showdown, I'm a little bit under my PT of ca. 355W at 0.8625V and have 1950MHz stable.
Under water I think a 426W PT is OK; for that I have bought 25mOhm shunts for 20% more PT. For 24/7 use I think this is a good choice.
5 or 8mOhm is, for me, like an open drain, and I don't want to blow up my card.
I hope it's enough for around 2100+- in a heavy-load game.
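For reference, those shunt numbers work out with simple parallel-resistor math: soldering a resistor across the stock current-sense shunt lowers the effective resistance, so the card under-reads its own current and draws correspondingly more real power at the same reported limit. A sketch, assuming the common ~5mΩ stock shunt (verify the value on your own board before trusting any of this):

```python
def parallel(r1_mohm, r2_mohm):
    """Equivalent resistance of two resistors in parallel (milliohms)."""
    return r1_mohm * r2_mohm / (r1_mohm + r2_mohm)

STOCK = 5.0  # mOhm, assumed stock shunt value -- check your card

def power_scale(added_mohm):
    """Factor by which real power exceeds reported power after the mod."""
    return STOCK / parallel(STOCK, added_mohm)

print(round(power_scale(25.0), 2))  # 1.2 -> ~20% more PT, matching the 25mOhm figure
print(round(power_scale(5.0), 2))   # 2.0 -> a 5mOhm stack doubles the effective limit
```

This is also why a 5mΩ add-on reads like an "open drain": it doubles the effective budget, while 25mΩ is the gentle +20% option.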


----------



## ZealotKi11er

Just hit one month waiting for my order in Canada. This is not looking good. It would be sad if I could get a 6800 XT before the RTX 3080.


----------



## dr.Rafi

PhoenixMDA said:


> You only need to raise the core voltage +100%. Look at the picture; there you can see it.
> Then you can set the point at 1.1V.
> 
> The chips on the Strix OC, TUF, and TUF OC are not really binned; the best chip was on a TUF, mine is a TUF OC, and one guy with a Strix OC has a really bad chip ^^


I understand silicon quality, but on the 3080, because NVIDIA used Samsung 8nm, I can see there is very low variance in chip quality, and you can't judge a Strix as having a bad GPU chip unless you can swap the chips onto the same PCB. In the CPU market you can easily tell when there is a big difference between chips by swapping CPUs on the same motherboard, but on a graphics card it's very hard: there are thousands of components on the PCB, and the GPU chips don't get the same quality of voltage and power delivery on different PCBs, even if they are the same brand and both Strix, for example. One tiny leak on the PCB, or one leaky MOSFET or power chip, can change the whole story, even if both pass the stock-clock quality tests.
On the 2080 Ti, yes, there was a big difference; I had three 2080 Tis and each one performed very differently.


----------



## PhoenixMDA

@dr.Rafi 
I think most chips are not bad, but there are also really bad chips; the range, I think, is +150MHz, with the same capacitors, VRM, etc.
The problem is that if you need more VCore for frequency XXXX, the power you need is much higher, so your PT is reached faster.
An inefficient VRM also costs you power; a bad VRM also costs you MHz.
One guy with the ASUS Strix OC has a not-so-good chip; he achieved, with +MHz on [email protected],081V, ca. 30MHz less than with a "point on MHz" at 1.1V.
And then 450W with a bad chip is not enough.


----------



## dr.Rafi

SoldierRBT said:


> Like this? Took the screenshot at 32C to match your photo.
> 
> View attachment 2463522


But the GPU didn't even start a load; I can also take a snapshot @2400MHz with an immediate crash.


----------



## dr.Rafi

PhoenixMDA said:


> @dr.Rafi
> I think most chips are not bad, but there are also really bad chips; the range, I think, is +150MHz, with the same capacitors, VRM, etc.
> The problem is that if you need more VCore for frequency XXXX, the power you need is much higher, so your PT is reached faster.
> An inefficient VRM also costs you power; a bad VRM also costs you MHz.
> One guy with the ASUS Strix OC has a not-so-good chip; he achieved, with +MHz on [email protected],081V, ca. 30MHz less than with a "point on MHz" at 1.1V.
> And then 450W with a bad chip is not enough.
> View attachment 2463555






I've pushed the memory to the end of the MSI AB slider, and beside GPU-Z is the power supply application, iCUE, indicating the in and out power of the whole system, because GPU-Z doesn't indicate the actual TDP after shunt modding.


----------



## PhoenixMDA

I think it's a good way to get a rough assessment of chip quality; some chips perhaps need a little bit more power, or will run hotter, at the same max boost frequency.

With the RTX 2000 series it was round about 200MHz.


----------



## BluePaint

Interesting test. My card (MSI Trio + Strix 450W BIOS) holds between 2340 and 2325 when < 30 Celsius (open-window air cooled).
I wonder what influences the FPS the most, besides the GPU. 7000 FPS vs 5000 FPS is quite a lot. A CPU overclock from stock to 4725MHz on the best core gave me about 10% higher FPS. I guess it likes Intel CPUs better, and RAM speed might also have an influence. Or is the monitor refresh rate somehow relevant too?











I also thought that the voltage slider wouldn't really have an effect on a 3080, but with +50% (+100% had more stability issues in quick testing) the GPU managed an average of 2163MHz vs. 2133MHz before in Port Royal:
12777 Port Royal, 2163MHz avg


----------



## DueAlian

dr.Rafi said:


> View attachment 2463526


Thanks for this video! Now I want to watercool my MSI RTX 3080 Ventus OC so badly! Which BIOS are you currently using? I tried different ones; I'm now stuck with the ASUS TUF OC BIOS, which can only get me up to 330W for some reason.


https://www.3dmark.com/spy/14817852


----------



## PhoenixMDA

@BluePaint
How much FPS you reach in ATITool is a question of the CPU. My 24/7 setup is a 9900K at 5.2/4.8 with 4x8GB 4400 CL17-17, 65.5k copy, 35.5ns.


----------



## Edge0fsanity

DStealth said:


> No, probably 1-2 max 3 - 13mhz straps difference temeperature dependant.


That much is a given. What I'm really interested in is whether higher clocks are possible at a given voltage step. For example I can hit 2070mhz on air before temp throttling @ 1025mV. Can it go higher at that voltage on water? Also, does it get a higher voltage ceiling before power limit kicks in since its running much cooler? I'm thinking yes on this since I see a 20-30w difference when I first start a game before the card heats up.


----------



## DStealth

VRMs are more efficient at lower temps, plus the fans are not consuming power either. IMO @1025mV you won't get any further than 2070MHz, probably just more consistent. But yes, of course, with water and lower temps, higher frequencies with higher voltages are possible if the limit allows...


----------



## BluePaint

What does additional voltage with AB for Ampere or maybe Turing actually do? I did a quick google search but didn't find anything useful.




PhoenixMDA said:


> @ BluePaint
> How much FPS you reach bei ATI Tool is a question of the CPU.My 24/7 is 9900k 5,2/4,8/4x8GB 4400CL17-17 65,5k Copy 35,5ns.


Update: had the core affinity slightly wrong. With correct core affinity to the 4700MHz cores only, the FPS are almost 6000. That's about a 20% difference to [email protected], which makes sense: 10% clock + 10% single-core advantage.

Will be interesting to test a new Zen 3 CPU with it.

My RAM has a copy of 63k but only 62.7ns due to Zen 2.








@DStealth
Copy is probably as good as yours because my RAM is dual-rank compared to your single-rank modules.
Your latency is fantastic. And a great CPU too. Unfortunately, my memory controller doesn't seem to like anything above 1867.


----------



## DStealth

63k copy with just 3,732MHz is very nice; I get that with 100+MHz higher and very tight timings on an AMD setup.


----------



## shallow_

Finally, my ASUS 3080 Strix OC is reserved in stock at Komplett.no; a few more days now.

Man, I've never had this much trouble spending $1000...


----------



## SoldierRBT

dr.Rafi said:


> even the gpu didnot start ,ican also take a snapshoot @2400 hz with immediate crash


+300 at 1.1v locked, it would start at 2355MHz and drop to 2340MHz <32C; after that it would hold 2325MHz. I took the photo at 32C to match Phoenix’s photo. Keep in mind my GPU is still on the stock cooler, not shunt modded and watercooled like yours.


----------



## PhoenixMDA

BluePaint said:


> What does additional voltage with AB for Ampere or maybe Turing actually do? I did a quick google search but didn't find anything useful.
> 
> 
> 
> Update: had the core affinity slightly wrong. With correct core affinity to the 4700Mhz cores only, the fps are almost 6000. That's about a 20% difference to [email protected] which makes sense. 10% clock + 10% single core advantage.
> 
> Will be interesting to test new Zen 3 CPU with it.
> 
> My RAM has copy of 63k but only 62,7ns due to Zen 2.
> 
> 
> 
> 
> 
> 
> 
> 
> @DStealth
> Copy is probably as good as yours because RAM is dual rank compared to your single rank modules.
> Your latency is fantastic. And great CPU too. Unfortunately, my memory controller doesn't seem to like anything above 1867.


The difference of my old 24/7 setup (5.24/4.82, 4x8GB 4400 CL17-17) was ca. 25% in games compared to an AMD 3900X maxed out with awesome RAM OC.
The AMD has more IPC power, but the latency, at 60ns, is too bad to bring that power into games. If you think about it, how can you read and write less than you copy?
I think perhaps it's cache hits.
With Intel you need both copy and latency:
with 4x8GB 4400 CL15-15, 65.5k copy, 34.1ns I have less power









than with 4x8GB 4533 CL16-16, 67.2k copy, 34.6ns at the same CPU frequency (not GPU limited).









The coming 5900X will do this much better in gaming performance.

@SoldierRBT
As I said, two steps (30MHz) better than my chip, but I'm satisfied with the card; I have a good chip and no throttle in performance with +1400 on mem.
I have also preordered a 3080 Strix OC and a 3090 Strix OC, but you know... if I get my waterblock before they arrive, the 3080 TUF will stay in my "machine".


----------



## dr.Rafi

BluePaint said:


> Interesting test. My card holds between 2340 and 2325 when < 30 celsius (open window air cooled).
> I wonder what influences the FPS the most, besides the GPU. 7000 pts vs 5000 pts is quite a lot. CPU overclock from stock to 4725Mhz for max core gave me about 10% higher FPS. I guess it likes Intel CPUs better and RAM speed might also have an influence. Or is the monitor refresh rate somehow relevant too?
> 
> View attachment 2463560
> 
> 
> 
> I also thought that the voltage slider will not really have an effect for a 3080 but with +50% (+100% had more stability issues in quick testing) the GPU managed an average of 2163Mhz vs 2133Mhz from before in Port Royale:
> 12777 Port Royale, 2163Mhz avg





PhoenixMDA said:


> If my Chip is under 32Grad i can hold 2310Mhz, it´s start by 2325 and go fast down to 2310 but there is the limit of my Chip.
> View attachment 2463527
> 
> 
> In 24/7 in hard Case Game like Hunt Showndown i´m a little bit unter my PT of ca. 355W by 0,8625V and have 1950Mhz stable.
> Under Water i think 426W PT is ok for that i have bought 25mOhm for 20% more PT.For 24/7 is this a good choice i think.
> 5 or 8mOhm is for my like open Drain and i dont want to blow up my Card.
> i Hope it´s enough for arround 2100+- by heavy Load Game.
> View attachment 2463528

Wonder which card and which BIOS is in that photo?


----------



## dr.Rafi

DueAlian said:


> thanks for this video! Now i want to watercool my msi rtx 3080 ventus oc so badly! Which bios are you currently using? I tried different ones, now stuck with asus tuf oc bios which can only get me upto 330W for some reason.
> 
> 
> https://www.3dmark.com/spy/14817852


If you shunt mod, you can use the FTW3 Ultra 450W or 400W BIOS; if not, the best for me was the Aorus Master. Using a 2x 8-pin BIOS after the shunt mod will give you a 20 to 30 watt power lift but will still limit the card. The second thing I noticed is that the core clock will fluctuate a lot between 2D and 3D in certain applications with lower load demand, even though it won't affect your framerate. Maybe that's because in my case I used 5 milliohm shunts; if you use 25 or more, you get less extra power but fewer issues using a 2x 8-pin BIOS.


----------



## dr.Rafi

PhoenixMDA said:


> @ BluePaint
> How much FPS you reach bei ATI Tool is a question of the CPU.My 24/7 is 9900k 5,2/4,8/4x8GB 4400CL17-17 65,5k Copy 35,5ns.


Also, I noticed that using the curve in Afterburner will give you a higher, more consistent clock, but it drops my bench score in all the 3DMark tests and Superposition; 2160 with the curve gives me less performance than 2115 without.


----------



## rambosbff

What's with ATITool showing up? Hasn't that been dead for 14-15 years? Just curious!


----------



## gemini002

Yes, got my Strix; no more flashing, as I have the real deal now. Honestly, it's the best BIOS IMO. It works amazingly with the MSI Trio.


----------



## PhoenixMDA

dr.Rafi said:


> Also, I noticed that using the curve in Afterburner will give you a higher, more consistent clock, but it drops my bench score in all the 3DMark tests and Superposition; 2160 with the curve gives me less performance than 2115 without.


I have only got one card, the 3080 TUF OC, from Cyberport Germany for 759 euro; on preorder I have the 3090 Strix OC from Amazon (17.9.) and a 3080 Strix OC from Cyberport.
The card is on the stock BIOS, PT ca. 355W. The cards here are also very rare, or very expensive.

With a normal curve I get the best max score in 3DMark "bench-stable" at +240 to +250 on the GPU and +1400 on mem. But for 24/7 I use a fixed clock under the PT for heavy game load.
You can get a slightly better score if, in the NV driver, you turn AF off and set prerendering from 1 to 3 or 4.
This is with 355W; the shunt mod I will do when I get the waterblock.









@rambosbff
ATITool is only nice for comparing chips, nothing more. I've used it since the ATI 9700 Pro.


----------



## rjrusek

Getting my RTX 3080 MSI Gaming Trio soon..

What is the version of the latest Strix OC BIOS that I can use? Can it be downloaded from TechPowerUp? Also, is a power shunt required, or will just doing the BIOS flash do?

Thank you in advance.
RJR


----------



## ssgwright

If anyone is looking for a reference 3080 Alphacool waterblock, let me know... I bought it not knowing it wouldn't fit a TUF.


----------



## dr.Rafi

PhoenixMDA said:


> I have only got one card, the 3080 TUF OC, from Cyberport Germany for 759 euro; on preorder I have the 3090 Strix OC from Amazon (17.9.) and a 3080 Strix OC from Cyberport.
> The card is on the stock BIOS, PT ca. 355W. The cards here are also very rare, or very expensive.
> 
> With a normal curve I get the best max score in 3DMark "bench-stable" at +240 to +250 on the GPU and +1400 on mem. But for 24/7 I use a fixed clock under the PT for heavy game load.
> You can get a slightly better score if, in the NV driver, you turn AF off and set prerendering from 1 to 3 or 4.
> This is with 355W; the shunt mod I will do when I get the waterblock.
> View attachment 2463622
> 
> 
> @rambosbff
> ATITool is only nice for comparing chips, nothing more. I've used it since the ATI 9700 Pro.


PT? ca.? What do those mean?
And for Time Spy I get a good graphics score, 19750, but the total is bad because of the CPU.


https://www.3dmark.com/spy/14749694


----------



## gemini002

rjrusek said:


> Getting my RTX 3080 MSI Gaming Trio soon..
> 
> What is the version of the latest STRIX OC bios that I can use? Can it be downloaded from TechPowerUp? Also is a power shunt required or just doing the bios flash will due?
> 
> Thank you in advance.
> RJR


Yes, you can download it from TechPowerUp.


----------



## PhoenixMDA

@dr.Rafi
By PT I mean the power limit/power target; sorry for my English. The TUF OC has 375W if you read out the BIOS, but it's really only around 355W; not like the FE, where 370W is really 370W.
Your graphics score is good; your CPU score can also be good with an all-core OC. With Samsung B-die optimized, including subtimings, most people get over 14k points there.

@rjrusek
No shunt mod required; that is a card with a 3x 8-pin connector.


----------



## Shocchiz

Hi, I was able to order a PNY 3080; it will arrive in two days.
According to the first-page chart it's a reference board with 2 connectors and 350W, and I'm trying to figure out what to do with its original BIOS.

I'm reading the thread, and some say to use the Palit OC BIOS for reference boards, while others seem to use the Aorus one (that's not clear to me).
The Palit OC BIOS should have the same 350W power limit as the PNY one, so I guess there's no point in flashing, am I right?
The Aorus Master has a 370W PL, but it's a custom board; is it OK to flash it on a reference one?
Thanks in advance for the answers.


----------



## DStealth

PhoenixMDA said:


> @dr.Rafi
> Your graphics Score is good, your CPU Score can also be good with "Allcore OC", with Samsung B-DIE optimized incl. Sub´s the most are over 14k Points there.


I have 15.5k with a 3900X on my old 1080 Ti Time Spy run.








It was actually higher with the same card than my [email protected] with 4700C16 memory: https://www.3dmark.com/compare/spy/12479317/spy/12695670

Edit: I gave up on the Palit 3080 and left it in my son's PC; tomorrow I'm awaiting an FTW3 Ultra and mounting water on it... Let's hope a good GPU comes.


----------



## PhoenixMDA

@DStealth
Very nice for AMD, but your 10900K is too low; with 5.4GHz [email protected] you should have over 16.5k.
With a 10900K [email protected] and subtimings, with 4-way interleaving, it's around 17k.
You didn't change the subtimings, so they don't bring speed. Look at my subs; if you want to feed Ampere at WQHD high FPS, it's necessary.


----------



## DStealth

Oh thanks, I was wondering how downclocking from C14 4800 to 4600 C17 could get higher 3DMark CPU scores.


----------



## PhoenixMDA

4800 C14 is for SuperPi and similar benches.
With 2-way interleaving you have the better latency; with 4-way you have more bandwidth.


----------



## dr.Rafi

PhoenixMDA said:


> @dr.Rafi
> I mean with PT Powerlimit/Powertarget sry for my english.The TufOc has 375W if you read out the Bios, but it´s only round about 355W, not how the FE there are 370W really 370W.
> Your graphics Score is good, your CPU Score can also be good with "Allcore OC", with Samsung B-DIE optimized incl. Sub´s the most are over 14k Points there.
> 
> @rjrusek
> No Shunt Mod required, this is a card with 3x8Pin connector.


Thanks; your English is good, it's my bad. I used to overclock 14 years ago on Xtreme Systems with phase-change cooling, but I stopped, and I'm recently back again with more free time.
Regarding the Founders Edition: with this release NVIDIA is approaching the end of Moore's law, so they are trying to be the best card designer, not only a chip maker. Their cards are so different in everything; they gave their partners a stupid reference design and left them unable to properly test their cards against the real thing.


----------



## dr.Rafi

DStealth said:


> I have 15.5k with 3900x on my old 1080ti TimeSpy run
> 
> 
> 
> 
> 
> 
> 
> 
> It was actually higher with the same card than my [email protected] with 4700c16 memory  https://www.3dmark.com/compare/spy/12479317/spy/12695670
> 
> Edit: I gave up with Palit 3080 and left it in my sons PC awaiting tomorrow FTW3 Ultra and mounting water on it...Let's hope e a good GPU will come.


What is wrong with your Palit?


----------



## dr.Rafi

Today I will receive 50 MLCC capacitors, the last step to mod the Ventus and compare results.


----------



## dr.Rafi

DStealth said:


> I have 15.5k with 3900x on my old 1080ti TimeSpy run
> 
> 
> 
> 
> 
> 
> 
> 
> It was actually higher with the same card than my [email protected] with 4700C16 memory: https://www.3dmark.com/compare/spy/12479317/spy/12695670
> 
> Edit: I gave up on the Palit 3080 and left it in my son's PC, awaiting the FTW3 Ultra tomorrow and mounting water on it... Let's hope a good GPU comes.


Is that on all cores? 4720!!! The best I can do on all cores is 4350.
Very impressive results for both CPUs, especially since you're using cheap motherboards for both. Well done; that is the fun of overclocking: not paying for premium parts and getting the best scores so easily.


----------



## Zeakie

Is shunt modding the same process for every card, or is it card-specific? Wondering whether to do it on my Trinity OC.


----------



## spajdr

Guys, has anyone found a BIOS for the Gainward Phoenix (GS or non-GS) that actually raises the power limit (to more than 350W)?


----------



## DueAlian

I was wondering the same for my MSI Ventus OC. I can't get it above 330W with any BIOS (currently on the Asus TUF OC one). Interestingly, even with the power limit slider set to max (which should allow 375W), I hit the power limit as soon as the GPU reaches 330W in games (per MSI Afterburner). Also, I can't flash the NVIDIA reference BIOS on it.


----------



## DokoBG

ZealotKi11er said:


> Just hit 1 month of waiting for my order in Canada. This is not looking good. It would be sad if I could get a 6800 XT before an RTX 3080.


I got my 3080 FTW3 Ultra from Memory Express this Monday!!! I pre-ordered on Sept 19.


----------



## dr.Rafi

DueAlian said:


> I was wondering the same for my MSI Ventus OC. I can't get it above 330W with any BIOS (currently on the Asus TUF OC one). Interestingly, even with the power limit slider set to max (which should allow 375W), I hit the power limit as soon as the GPU reaches 330W in games (per MSI Afterburner). Also, I can't flash the NVIDIA reference BIOS on it.


The only way is a shunt mod. If you don't want to go as high as 400W+, you can shunt just one 8-pin rail (only two shunt resistors); that will give you close to 390 or 400W max. But always check how much your total system draws from the wall socket under load before and after the mod, and add the difference to the total graphics card consumption you had before the mod.
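dr.Rafi's wall-socket method boils down to one line of arithmetic: after a shunt mod the card under-reports its own draw, so the honest figure is the pre-mod software reading plus the increase in wall draw under the same load. A minimal sketch; all wattage figures below are hypothetical illustration values, not measurements from this thread:

```python
def true_card_power(card_w_before, wall_w_before, wall_w_after):
    """Pre-mod card reading plus the wall-draw delta under the same load."""
    return card_w_before + (wall_w_after - wall_w_before)

# e.g. card read 320W before the mod; wall draw rose from 520W to 590W:
print(true_card_power(320, 520, 590))  # 390
```

Note this ignores PSU efficiency: the extra watts at the wall are slightly more than the extra DC power the card actually receives, so the estimate errs a little high.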


----------



## dr.Rafi

Zeakie said:


> Is shunt modding the same process for every card, or is it card-specific? Wondering whether to do it on my Trinity OC.


It is basically the same, but the arrangement of the shunt resistors differs from card to card; with a multimeter you can find which ones you want to mod.


----------



## DStealth

PhoenixMDA said:


> 4800C14 is for SuperPi and similar benches.
> With 2-way interleaving you get better latency; with 4-way you get more bandwidth.


Do you have any results with a 10900K in Time Spy at 16.5k or 17k CPU score, or are you just extrapolating from your 9900K? Because I've got ~13.5k with the 9900K but cannot exceed 15k with the 10900K, no matter that it has 25% more cores...
Here's my bandwidth










dr.Rafi said:


> What is wrong with your Palit?


Nothing wrong... it's just 2x8-pin, and I don't want to mount a water block on it, as power is the limiting factor even after shunting it. I have the opportunity to get an FTW3 Ultra, which will surely be better out of the box.


----------



## cstkl1

So the 5900X beats the **** out of the 10900K, right?

Out of the 10 games in the 6800 XT vs 3080 comparison, 4 have a built-in benchmark.
I can only test 3.

So let's see:

*10900k 51|48
2x16GB 4400C18
RTX 3080 nerfed to a 320W limit with max temp 68°C, rads nerfed to 200rpm, and the pump also nerfed to the lowest D5 setting.
Windows with the typical background nonsense running: Armoury Crate, Steam, Afterburner, etc., so no optimized Windows here...*












Spoiler


----------



## cstkl1

finally can play this game with decent frames...


----------



## BluePaint

@cstkl1 Good test!

Pragmatically, I think the better the performance of the new AMD cards, the better for NVIDIA owners too.
Either NVIDIA is forced to bring out something like a 3080 Ti with 12GB VRAM, and/or they need to lower prices a bit (seems unlikely atm).

Also, with DLSS NVIDIA still has the software advantage. If Cyberpunk 2077, for example, has a good DLSS implementation, that will get you +30% performance or more with very similar visuals.

But I would also love to use 2x 6800 XT CF for some older games that work well with it (e.g. Total War: Warhammer 2) to game in 8K via VSR (because AA is garbage in Total War games and the visuals have tons of small details prone to flickering).

My 2x 1080 Ti perform similarly to the 3080 in titles with good SLI support, and the 1080 Ti has 1GB more VRAM.
My 2x 1080 Ti Time Spy scores are also higher.


----------



## DueAlian

BluePaint said:


> @cstkl1 Good test!
> 
> Pragmatically, I think the better the performance of the new AMD cards, the better for NVIDIA owners too.
> Either NVIDIA is forced to bring out something like a 3080 Ti with 12GB VRAM, and/or they need to lower prices a bit (seems unlikely atm).
> 
> Also, with DLSS NVIDIA still has the software advantage. If Cyberpunk 2077, for example, has a good DLSS implementation, that will get you +30% performance with very similar visuals.
> 
> But I would also love to use 2x 6800 XT CF for some older games that work well with it (e.g. Total War: Warhammer 2) to game in 8K via VSR (because AA is garbage in Total War games and the visuals have tons of small details prone to flickering).
> 
> My 2x 1080 Ti perform similarly to the 3080 in titles with good SLI support, and the 1080 Ti has 1GB more VRAM.
> My 2x 1080 Ti Time Spy scores are also higher.


Absolutely agree with everything you touched on here. I was playing Watch Dogs: Legion yesterday, and DLSS on Performance mode doubled the fps (from 44 to 88 at 4K ultra with ray tracing on).

I can't agree more with what you said regarding CF/SLI. NVIDIA really made me mad by not supporting SLI after the 1080 Ti. I used to have 2x MSI GTX 1080 Ti Gaming X cards, and most of the games I played supported it well. Then I purchased an RTX 2080 Ti, which I don't really regret, because I got my [email protected] monitor back then, and that was the best experience. I just think it is a little too greedy not to support SLI on the 3080 so they can sell the 10% faster 3090 for more than double the price. I had no qualms paying £1500 for the 1080 Tis, but I didn't feel the same about the 3090, and got a 3080.

I think AMD has opted for the same path: even though the 6900 is much cheaper than the 3090, it is still quite expensive compared to the 6800 XT, which sounds like the best choice, considering the 6800 is priced higher than the 3070 while its specs look considerably weaker than the 6800 XT's. I might get a 5900X and a 6800 XT next month and see how it performs against the 3080.


----------



## cstkl1

BluePaint said:


> @cstkl1 Good test!
> 
> Pragmatically, I think the better the performance of the new AMD cards, the better for NVIDIA owners too.
> Either NVIDIA is forced to bring out something like a 3080 Ti with 12GB VRAM, and/or they need to lower prices a bit (seems unlikely atm).
> 
> Also, with DLSS NVIDIA still has the software advantage. If Cyberpunk 2077, for example, has a good DLSS implementation, that will get you +30% performance with very similar visuals.
> 
> But I would also love to use 2x 6800 XT CF for some older games that work well with it (e.g. Total War: Warhammer 2) to game in 8K via VSR (because AA is garbage in Total War games and the visuals have tons of small details prone to flickering).
> 
> My 2x 1080 Ti perform similarly to the 3080 in titles with good SLI support, and the 1080 Ti has 1GB more VRAM.
> My 2x 1080 Ti Time Spy scores are also higher.


From what I can see, AMD is only worth considering if you're going to upgrade to the Ryzen 5000 series;
it seems to hinge on that.

But whether all their aforesaid tech actually works, since it's all driver/software based, is another question, because FreeSync to date is ****. Heck, the G-Sync Compatible version is way better, since NVIDIA tests panels more often and releases panel-specific driver fixes; AMD, nada. So it makes you wonder: are they blaming the scalers on the monitors, or the monitor vendors...

I will be getting either the 6800 XT or 6900 XT, but it's looking more like the former, because I also need to buy the CPU and mobo (B550 ITX Strix; yes, this is the least problematic mobo in AMD's whole lineup and, afaik, the only mobo on which all Ryzen 3000-series chips boost at rated clocks by default)... Crosshair Hero/Dark, hmmm; if it's using the same OptiMem layout as the Z490 Hero, then no thank you...

It's gonna be interesting to see how far reviewers lie. Have you guys noticed they're all keeping mum on the selective ******ness of AMD's RDNA 2 slides?
Why no 5700 XT comparison in fps? Because that would be the same driver and we could extrapolate from it. And just one slide with the 6800/6800XT/6900XT...
So are all of them just covid-stupid or what??
Even with the iPhone 12 launch, we all knew just from Apple NOT talking about battery that it sucks, but reviewers who didn't get one started saying great things about it. How much payola is going around among people who claim to be above board??


----------



## cstkl1

DueAlian said:


> Absolutely agree with everything you touched on here. I was playing Watch Dogs: Legion yesterday, and DLSS on Performance mode doubled the fps (from 44 to 88 at 4K ultra with ray tracing on). I can't agree more with what you said regarding CF/SLI. NVIDIA really made me mad by not supporting SLI after the 1080 Ti. I used to have 2x MSI GTX 1080 Ti Gaming X cards, and most of the games I played supported it well. Then I purchased an RTX 2080 Ti, which I don't really regret, because I got my [email protected] monitor back then, and that was the best experience. I just think it is a little too greedy not to support SLI on the 3080 so they can sell the 10% faster 3090 for more than double the price. I had no qualms paying £1500 for the 1080 Tis, but I didn't feel the same about the 3090, and got a 3080. I think AMD has opted for the same path: even though the 6900 is much cheaper than the 3090, it is still quite expensive compared to the 6800 XT, which sounds like the best choice, considering the 6800 is priced higher than the 3070 while its specs look considerably weaker than the 6800 XT's. I might get a 5900X and a 6800 XT next month and see how it performs against the 3080.


Eh.. it's out.. thought it would only be later today, damn.. I'm still practising to get good at Ghostrunner, which is awesome hand/eye coordination exercise... Watch Dogs I only preloaded a couple of days back...


----------



## cstkl1

Watch Dogs: Legion at 1440p, maxed out with DLSS Quality and every detail setting maxed

10900k+rtx3080


----------



## cstkl1

Ignore the fps, but this is what the game looks like at
ultra + ultra RT + DLSS Quality + max detail @1440p.

Based on a search, my fps in the previous post is higher. Could it be that I'm still using the unnerfed day-one drivers??


----------



## cstkl1

Could it be that's why AMD has lower scores on their slides??


----------



## DueAlian

By the way, I didn't see anyone talking about HDMI 2.1 on the new AMD 6000 GPUs...


----------



## spajdr

*cstkl1 The Division 2 has a trial version that should include the benchmark; could you retest it too, please?*


----------



## DueAlian

Managed to get myself an RTX 3070 from Amazon... I'm surprised it's actually sold from US stock... Even with the import fees it's still cheaper; I just have to wait longer, as I didn't want to pay an additional £10.


----------



## spajdr

@DueAlian Nice mate, but wrong forum


----------



## cstkl1

spajdr said:


> *cstkl1 The Division 2 has a trial version that should include the benchmark; could you retest it too, please?*


Will do it in a bit.. seems small..


----------



## cstkl1

DueAlian said:


> By the way, I didn't see anyone talking about HDMI 2.1 on the new AMD 6000 GPUs...


Did we even see an actual card?? A physical one??...


----------



## DueAlian

cstkl1 said:


> Did we even see an actual card?? A physical one??...


I think she was holding one in her hand. I'm just surprised no one mentioned HDMI 2.1 at all; it would be a real deal-breaker for me at least. The main, and probably only, reason I purchased a custom-built PC just to get my hands on an RTX 3080 is HDMI 2.1. The performance is great too, but I wouldn't have upgraded over my RTX 2080 Ti otherwise. It's just that [email protected] on an LG C9 OLED is irresistible.


----------



## spajdr

cstkl1 said:


> will do it in abit.. seems small..


Cheers.
By the way, I posted your results on other forums, and people think the increase is mainly from the DDR4 running at 4400MHz, compared to the AMD system, which was running at 3200MHz.


----------



## cstkl1

spajdr said:


> *cstkl1 The Division 2 has a trial version that should include the benchmark; could you retest it too, please?*


Had to restart the comp, change rad fans, heat it up, etc., then nerf the card to see it perform like a stock FE boosting with a 320W limit.
AMD didn't lie; it's spot on.

138 fps



Spoiler















I think they claimed the 6800 XT did 133 and NVIDIA a bit more...


----------



## cstkl1

spajdr said:


> Cheers.
> By the way, I posted your results on other forums, and people think the increase is mainly from the DDR4 running at 4400MHz, compared to the AMD system, which was running at 3200MHz.


Funny how people always mention this. Isn't this against a 5900X that's supposed to BLOW the 10900K out of the water and create world records?... It's at 1.4V; I didn't go overboard running [email protected] for them to complain about. But 4266-4400 on Z490 is akin to 3600-3800 for Ryzen... Also, bro, average fps doesn't increase much from RAM; it's more about the minimum fps...


----------



## spajdr

cstkl1 said:


> Funny how people always mention this. Isn't this against a 5900X that's supposed to BLOW the 10900K out of the water and create world records?... It's at 1.4V; I didn't go overboard running [email protected] for them to complain about. But 4266-4400 on Z490 is akin to 3600-3800 for Ryzen... Also, bro, average fps doesn't increase much from RAM; it's more about the minimum fps...


I know mate, but sometimes you can't change how people think; they simply believe what they want.


----------



## cstkl1

Updated.

Did it under the conditions AMD stated.












Spoiler


----------



## GTANY

cstkl1 said:


> Updated.
> 
> Did it under the conditions AMD stated.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler


What is your RTX 3080 frequency in these benchmarks?


----------



## cstkl1

GTANY said:


> What is your RTX 3080 frequency on these benchmarks ?


It fluctuates between 18xx and 1980MHz.


----------



## PhoenixMDA

dr.Rafi said:


> Thanks, your English is good; it's my bad. I used to overclock 14 years ago on Xtreme Systems with phase-change cooling, but stopped, and I've recently come back with more free time.
> Regarding the Founders Edition: with this release NVIDIA is approaching the end of Moore's law, so they are trying to be the best card designer, not just a chip maker. Their cards are so different in everything that they gave a poor reference design to their partners and even left them unable to test their cards against the real thing.


Thanks, yes that's true, but I must say the NVIDIA FE VRM is really nice, better than most customs; the only bad thing is the memory hotspot temperature.
But the performance difference from [email protected] to the [email protected] is not that much, as shown.


----------



## PhoenixMDA

DStealth said:


> Do you have any results with a 10900K in Time Spy at 16.5k or 17k CPU score, or are you just extrapolating from your 9900K? Because I've got ~13.5k with the 9900K but cannot exceed 15k with the 10900K, no matter that it has 25% more cores...
> Here's my bandwidth
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nothing wrong... it's just 2x8-pin, and I don't want to mount a water block on it, as power is the limiting factor even after shunting it. I have the opportunity to get an FTW3 Ultra, which will surely be better out of the box.


The most from HWL are between 16.5-17k CPU score. For example, Snakeeyes with 2x16GB 4400CL16 needs ca. 100MHz more CPU clock for nearly the same score as Yoshimura with [email protected].
Here is his score with 4400 4-way:
https://www.3dmark.com/spy/13683861
And here is what you can reach stable if you have a good Apex/CPU/memory sample; it's also from Yoshimura. He has the most luck and can boot up to 4700, and run 4600 24/7.
At this frequency with 2x16GB dual-rank it's better to fix the ODTs, but it's not necessary to set slopes for full stability like in my setting.
He has a CPU score of 16783 at 5.3GHz; that's without PPD 0.









With PPD0 and TXP4 you can get better latency, like the 31.7ns in my screenshot; and look at my Geekbench memory score with 5.4GHz^^
System manufacturer System Product Name - Geekbench Browser









24/7 in Time Spy I'm around 14k. My best at 5.45GHz 4400CL17-17 was 14.5k, but I could do better if I wanted^^ I think between 14.6-14.7k is the end with my [email protected] with a higher memory score.
https://www.3dmark.com/spy/10264176


----------



## shallow_

Sucks that Asus isn't doing an AIO version of the 3080; that 6800 XT LCS is looking sweet..


----------



## DueAlian

Anyone else with a Ryzen 5 3600XT? It bottlenecks when playing Battlefield V at 4K ultra settings; fps drops all the time... probably related to the GPU utilization drop (it hits as low as 60% sometimes, which shouldn't happen at 4K).


----------



## rankftw

Is it safe to flash the Gigabyte Gaming OC Bios to my Palit Gaming Pro?


----------



## VPII

rankftw said:


> Is it safe to flash the Gigabyte Gaming OC Bios to my Palit Gaming Pro?


You can do it, but it will not give you more power. I've flashed so many BIOSes on my Palit, and it still drops clocks at 320W.


----------



## DStealth

I tried every single one too. If you can sacrifice the DP ports, the GB Aorus Master BIOS can consume ~350W, but it spikes from 320-330 onwards. The updated 350W Palit Gaming OC BIOS is probably better for 24/7 use.


----------



## VPII

Interestingly, what I noticed with the Palit GamingPro OC running fully stock with just the power limit increased: I run Shadow of the Tomb Raider at 1080p to check clock speeds, since it mostly stays around the max boost speed, and it sits between 2025 and 2055MHz; that is with the stock 1740MHz boost clock. Now, I sold that card to get the Gigabyte Eagle OC, which has a 1755MHz boost clock out of the box. However, when I run SOTTR the same way, this card sits around 1920 to 1980MHz. Not sure how that is possible, but it seems the boost clock on this card works differently. I will test again when I flash the Gaming OC BIOS, as I found I can reach the best clock speeds with that BIOS, but still not at the level the Palit reached. This card seems limited to 2085MHz max, but mostly 2070MHz.


----------



## VPII

Okay, so I just tested with the Gaming OC BIOS from Gigabyte, which basically has a 1800MHz stock boost clock. Running SOTTR at 1080p as stated above, it only boosts up to 2010MHz, and only briefly. That means the boost is only 210MHz above the rated clock, instead of the usual 300 to 315MHz I saw with the Palit. It just does not make sense to me.


----------



## dr.Rafi

Zelo said:


> 3080 Aorus Master owner here. Why am I getting a lower Time Spy score when I OC to +100/+700.
> 
> Pre OC Time Spy Graphics Score: 18080 -- Overall: 17173
> OC(+100/+700) Graphics Score: 16817 -- Overall: 16161
> 
> I ran it a few times and got similar results.


Hi Zelo, any updates on your Aorus Master 3080 scores now?


----------



## Vapochilled

VPII said:


> Okay, so I just tested with the Gaming OC BIOS from Gigabyte, which basically has a 1800MHz stock boost clock. Running SOTTR at 1080p as stated above, it only boosts up to 2010MHz, and only briefly. That means the boost is only 210MHz above the rated clock, instead of the usual 300 to 315MHz I saw with the Palit. It just does not make sense to me.


Flash the Gaming OC BIOS, then set a custom curve: 0.937V at 1950MHz, and something like 0.91V at 1920MHz, or 1905MHz with 0.887V, I guess.
During 4K benches in 3DMark (Time Spy and Fire Strike) my clocks don't go below 1900MHz, and the card hits 360W.
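Vapochilled's advice amounts to defining a few voltage/frequency points and flattening the curve above the chosen cap, which is what dragging the Afterburner curve down does. A toy sketch of that lookup (the three points are the ones from the post; the lookup logic is only an illustration of the idea, not how the driver actually works):

```python
CURVE = {0.887: 1905, 0.910: 1920, 0.937: 1950}  # volts -> MHz, from the post

def effective_clock(requested_v):
    """Clock at the highest defined point at or below the requested voltage;
    above the top point the curve is flat (that is the undervolt cap)."""
    usable = [v for v in CURVE if v <= requested_v]
    return CURVE[max(usable)] if usable else None

print(effective_clock(1.00))  # 1950: flat above the 0.937V cap
print(effective_clock(0.92))  # 1920
```

The flat top is the whole point of the undervolt: the card never requests more than 0.937V, so it stays inside the power limit at a steady clock instead of oscillating.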


----------



## Vapochilled

I also think that sooner or later a BIOS with more power should arrive, due to AMD competition. They will need to find a way to increase the 3080's power.

It's a shame the AORUS Master is still 370W...
It would be nice to see more from it.


----------



## Shocchiz

Shocchiz said:


> Hi, I was able to order a PNY 3080; it will arrive in 2 days.
> According to the first-page chart it is a reference / 2-connector / 350W board, and I'm trying to figure out what to do with its original BIOS.
> 
> I'm reading the thread, and some say to use the Palit OC BIOS for reference boards; others seem to use the Aorus one (that's not clear to me).
> The Palit OC BIOS should have the same 350W power limit as the PNY one, so I guess there's no point in flashing, am I right?
> The Aorus Master has a 370W PL, but it's a custom board; is it ok to flash it on a reference one?
> Thanks in advance for the answers.


Can no one give me any advice?
The card arrived, but it seems to be stuck around 320W with the default BIOS.


----------



## DStealth

PhoenixMDA said:


> System manufacturer System Product Name - Geekbench Browser


Lol, what? You're not distinguishing, benchwise, between a 2-DIMM-slot top model like the Apex for 500+ euro and the cheapest ~200 euro 4-DIMM Gigabyte board.
But here's my score anyway - Gigabyte Technology Co., Ltd. Z490 AORUS ELITE AC - Geekbench Browser


----------



## mattxx88

Shocchiz said:


> Can no one give me any advice?
> The card arrived, but it seems to be stuck around 320W with the default BIOS.


What matters is the power connectors; your card has 2x8-pin, so choose the highest-power 2x8-pin card BIOS to flash.


----------



## spajdr

Shocchiz said:


> Can no one give me any advice?
> The card arrived, but it seems to be stuck around 320W with the default BIOS.


If the Gigabyte Gaming OC or AORUS BIOS makes no difference on your card, then you're out of luck, I'm afraid.


----------



## VPII

spajdr said:


> If the Gigabyte Gaming OC or AORUS BIOS makes no difference on your card, then you're out of luck, I'm afraid.


I have to agree with you. I have seen it now with two RTX 3080 models, the Palit GamingPro OC and now the Gigabyte Eagle OC. The Palit will always be limited to 320W even with the 9% increased power limit (although it consumes more when increased), and the Eagle OC, normally limited to 340W, will always stay at 340W even with the 370W Gaming OC BIOS, although it consumes more power.


----------



## PhoenixMDA

DStealth said:


> Lol, what? You're not distinguishing, benchwise, between a 2-DIMM-slot top model like the Apex for 500+ euro and the cheapest ~200 euro 4-DIMM Gigabyte board.
> But here's my score anyway - Gigabyte Technology Co., Ltd. Z490 AORUS ELITE AC - Geekbench Browser


No, that's not what I meant; I want to say this is the max of this platform.

I have given my 3080 TUF OC a bit more PT, 426W, but without a water block it's....^^
For 24/7 I think 2,[email protected],975V is doable in heavy-load games; that is ok.


----------



## DStealth

Wow, great score; over 20k TS with a 2x8-pin card... on air. Your GPU is very, very good.


----------



## Vapochilled

PhoenixMDA said:


> No, that's not what I meant; I want to say this is the max of this platform.
> 
> I have given my 3080 TUF OC a bit more PT, 426W, but without a water block it's....^^
> For 24/7 I think 2,[email protected],975V is doable in heavy-load games; that is ok.
> View attachment 2463965


Shunt mod? 426W is way high for the TUF.


----------



## PhoenixMDA

DStealth said:


> Wow, great score; over 20k TS with a 2x8-pin card... on air. Your GPU is very, very good.


No, I have a good one, but not a very good one; Soldier's chip is much better, 30-45MHz, and he has the better VRM.
The bench is with pre-rendering at 4 and with AF off; this gives 300 points more than the basic settings.



Vapochilled said:


> Shunt mod? 426W is way high for the TUF.


Yes, with 25mOhm like I said; that's 20% more power, which is ok for this VRM. But if you want the best VRM efficiency it's better to go under water; I'm waiting for my water block.
The TUF has 20x 55A MOSFETs, and their best efficiency is at 20A to 25A. Four power stages are for the memory, ca. 80-90W, leaving 16 power stages for the rest.
Ca. 350W average on the 16 power stages, with peaks of ca. 370W: at 0.7V, worst-case scenario, that is 31.25A average and 33A peak per stage.
In 24/7 use I run 0.95-1V, which puts it between 20-25A, so it's fully ok.

For the power readings in Time Spy Extreme you must multiply by 1.2 with 25mOhm shunts; you can also see the PCIe slot power, which, as I thought, is max around 70W, no problem.
This is more of a 24/7 shunt mod.
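The two numbers in the post above can be checked with a few lines. The ×1.2 reporting factor falls straight out of the parallel-resistance math if the stock shunts are 5mOhm (an assumption; the post only gives the added 25mOhm), and the per-stage current is just power over voltage over stage count:

```python
def report_factor(r_stock_mohm, r_added_mohm):
    """Stacking a resistor in parallel lowers the sensed resistance, so
    software under-reports; multiply the reported power by this factor."""
    r_parallel = (r_stock_mohm * r_added_mohm) / (r_stock_mohm + r_added_mohm)
    return r_stock_mohm / r_parallel

def per_stage_current(total_w, v_core, n_stages):
    """Average current per VRM power stage at a given core voltage."""
    return total_w / v_core / n_stages

print(round(report_factor(5, 25), 2))             # 1.2
print(round(per_stage_current(350, 0.7, 16), 2))  # 31.25
```

With those inputs the numbers reproduce exactly: 5mOhm parallel 25mOhm gives 4.17mOhm sensed, hence ×1.2, and 350W at a worst-case 0.7V over 16 stages is 31.25A per stage, inside the 55A rating with headroom.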


----------



## dr.Rafi

rankftw said:


> Is it safe to flash the Gigabyte Gaming OC Bios to my Palit Gaming Pro?


It's safe to flash any 3080 BIOS to any 3080,
though I am NOT responsible for anyone else's screw-ups 💨


----------



## hemon

Hi,

has anyone achieved a solid (!) OC @2070MHz with undervolting? At what voltage? I'm now at 1.062V and it's 95% stable, with peaks of 2085/2100MHz. I have the Strix. Any suggestions?


----------



## parcher

Shocchiz said:


> Can no one give me any advice?
> The card arrived, but it seems to be stuck around 320W with the default BIOS.


Flash the Asus TUF OC BIOS..


----------



## Daemon_xd

Does anybody have the Palit RTX 3080 GameRock OC? This card has three 8-pin connectors, so it can be flashed with the Asus Strix or EVGA FTW3 BIOS, but I didn't find any information about performance uplifts with these BIOSes.


----------



## Shocchiz

spajdr said:


> If the Gigabyte Gaming OC or AORUS BIOS makes no difference on your card, then you're out of luck, I'm afraid.


Thanks for your reply.
I tried the Gaming OC BIOS but got similar results to the original one, so I put it back.

Now I'm wondering: what are considered good clocks and voltages at full load (I mean 100% load)?


----------



## dr.Rafi

PhoenixMDA said:


> The most from HWL are between 16.5-17k CPU score. For example, Snakeeyes with 2x16GB 4400CL16 needs ca. 100MHz more CPU clock for nearly the same score as Yoshimura with [email protected].
> Here is his score with 4400 4-way:
> https://www.3dmark.com/spy/13683861
> And here is what you can reach stable if you have a good Apex/CPU/memory sample; it's also from Yoshimura. He has the most luck and can boot up to 4700, and run 4600 24/7.
> At this frequency with 2x16GB dual-rank it's better to fix the ODTs, but it's not necessary to set slopes for full stability like in my setting.
> He has a CPU score of 16783 at 5.3GHz; that's without PPD 0.
> View attachment 2463849
> 
> 
> With PPD0 and TXP4 you can get better latency, like the 31.7ns in my screenshot; and look at my Geekbench memory score with 5.4GHz^^
> System manufacturer System Product Name - Geekbench Browser
> View attachment 2463851
> 
> 
> 24/7 in Time Spy I'm around 14k. My best at 5.45GHz 4400CL17-17 was 14.5k, but I could do better if I wanted^^ I think between 14.6-14.7k is the end with my [email protected] with a higher memory score.
> https://www.3dmark.com/spy/10264176
> View attachment 2463852











I am trying to get there, but there's still a lot to learn; this RAM kit is rated [email protected] CL14 and is 4 years old.
Then a bit more tweaking:


https://www.3dmark.com/3dm/52382800




https://www.3dmark.com/pr/455583


----------



## PhoenixMDA

@dr.Rafi
With Gigabyte, as far as I know, there is an issue in the BIOS: you can't fix the RTL/IOL or set an offset, so you can't get a very good latency.
But your score is ok.
It's very important to set all the subtimings like I have shown in my screenshot; if you get trouble booting a high frequency, in most cases it helps to search for the right ODTs.
The second thing: the best performance with daisy chain on Z490 is 2x16GB, if your board can do that at high speed (4400-4600MHz) like the MSI Unify, Apex, etc.
With 4-way interleaving (2x16GB DR, 4x8GB SR) you get, at 200-300MHz less, the same performance in most cases as with 2-way interleaving (2x8GB SR).
Under active air cooling 1.5V for B-die is ok. My memory is specially selected and under water, so for [email protected] I only need 1.486V.
The people who run 2x16GB [email protected],5xV have often tested more kits and CPUs.









TXP and PPD0 exist on Asus and MSI Z490 boards; I don't know about Gigabyte. On Asus Z390 you can only change the TXP and PPD values via MemTweakIt.
On Asus you have many values for getting high settings stable: slopes, ODT, BL. But it's very hard to find the right values; ODT is the easiest.
Here is a picture I have put together that explains what they mean. Memory OC can take a lot of time^^.
Slopes are control signals, and BL are the bitlines for data. Nice to know, but I think you don't need it.
IO/SA, VDimm, subtimings, ODT, and RTL/IOL are the important things.


----------



## dr.Rafi

I've noticed water cooling isn't always a good option for benchmarks: with lower temps the core clock boosts too high at the beginning of each Time Spy scene, which gives you less overclocking headroom and causes crashes at, for example, a +100 core offset. With higher temps the card is less prone to boost spikes and gives you a stable +105 to +110 on the core. I tested both scenarios with water cooling, but in the second test I lowered the pump and fan speeds to minimum, torture-tested the card for a while to raise the temperature, and then started the 3DMark bench. Can anyone who has thoroughly tested before and after water cooling correct me, or maybe it's just my card behaving this way?
I think the easy way is to simulate the thermal limit at a lower temp to stop the boost spikes,
because these cores are very temperature-sensitive when boosting: the core boosts at, for example, 26°C at a scene start, but the temp spikes to 34 or 36°C, so the core can't hold that clock and crashes.
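This observation matches how GPU Boost behaves: a cold die briefly holds a higher clock bin than it can sustain once it warms mid-scene. A toy model of that (the 15MHz bin size, the 5°C step, and the 30°C threshold are illustrative assumptions, not published NVIDIA figures):

```python
BIN_MHZ = 15      # assumed size of one boost bin
STEP_C = 5        # assumed temperature interval per dropped bin
BASE_TEMP_C = 30  # assumed temp up to which the top bin is held

def boost_clock(max_boost_mhz, temp_c):
    """Clock after temperature-driven bin drops."""
    bins_dropped = max(0, (temp_c - BASE_TEMP_C) // STEP_C)
    return max_boost_mhz - bins_dropped * BIN_MHZ

# Cold scene start vs warmed-up mid-scene:
print(boost_clock(2100, 26))  # 2100: top bin, may crash with a big offset
print(boost_clock(2100, 36))  # 2085: one bin lower, often stable
```

In this picture, pre-warming the card (or capping the curve) simply keeps the GPU out of that fragile top bin at scene starts, which is exactly the effect described above.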


----------



## dr.Rafi

PhoenixMDA said:


> @dr.Rafi
> With Gigabyte, as far as I know, there is an issue in the BIOS: you can't fix the RTL/IOL or set an offset, so you can't get a very good latency.
> But your score is ok.
> It's very important to set all the subtimings like I have shown in my screenshot; if you get trouble booting a high frequency, in most cases it helps to search for the right ODTs.
> The second thing: the best performance with daisy chain on Z490 is 2x16GB, if your board can do that at high speed (4400-4600MHz) like the MSI Unify, Apex, etc.
> With 4-way interleaving (2x16GB DR, 4x8GB SR) you get, at 200-300MHz less, the same performance in most cases as with 2-way interleaving (2x8GB SR).
> Under active air cooling 1.5V for B-die is ok. My memory is specially selected and under water, so for [email protected] I only need 1.486V.
> View attachment 2464090
> 
> 
> TXP and PPD0 exist on Asus and MSI Z490 boards; I don't know about Gigabyte. On Asus Z390 you can only change the TXP and PPD values via MemTweakIt.
> On Asus you have many values for getting high settings stable: slopes, ODT, BL. But it's very hard to find the right values; ODT is the easiest.
> Here is a picture I have put together that explains what they mean. Memory OC can take a lot of time^^.
> Slopes are control signals, and BL are the bitlines for data. Nice to know, but I think you don't need it.
> IO/SA, VDimm, subtimings, ODT, and RTL/IOL are the important things.
> View attachment 2464091


Thanks for the info. Still learning; there has been too much to absorb these last weeks, and I am very new to tweaking the latest-generation Intel platform.


----------



## dr.Rafi

PhoenixMDA said:


> @dr.Rafi
> With Gigabyte, as far as I know, there is an issue in the BIOS: you can't fix the RTL/IOL or set an offset, so you can't get a very good latency.
> But your score is ok.
> It's very important to set all the subtimings like I have shown in my screenshot; if you get trouble booting a high frequency, in most cases it helps to search for the right ODTs.
> The second thing: the best performance with daisy chain on Z490 is 2x16GB, if your board can do that at high speed (4400-4600MHz) like the MSI Unify, Apex, etc.
> With 4-way interleaving (2x16GB DR, 4x8GB SR) you get, at 200-300MHz less, the same performance in most cases as with 2-way interleaving (2x8GB SR).
> Under active air cooling 1.5V for B-die is ok. My memory is specially selected and under water, so for [email protected] I only need 1.486V.
> The people who run 2x16GB [email protected],5xV have often tested more kits and CPUs.
> View attachment 2464090
> 
> 
> TXP and PPD0 exist on Asus and MSI Z490 boards; I don't know about Gigabyte. On Asus Z390 you can only change the TXP and PPD values via MemTweakIt.
> On Asus you have many values for getting high settings stable: slopes, ODT, BL. But it's very hard to find the right values; ODT is the easiest.
> Here is a picture I have put together that explains what they mean. Memory OC can take a lot of time^^.
> Slopes are control signals, and BL are the bitlines for data. Nice to know, but I think you don't need it.
> IO/SA, VDimm, subtimings, ODT, and RTL/IOL are the important things.
> View attachment 2464091


"you have with 200-300Mhz less the same Performance" - do you mean 200 MHz less on the CPU speed or on the memory speed?


----------



## Dreamliner

Is there a vbios to increase the power limit of the TUF 3080?

I'm deciding between getting a TUF, STRIX, or FTW3 card. I don't know which has the higher power limit between the STRIX and the FTW3. Is the only difference between the FTW3 GAMING and ULTRA the vBIOS, or are the chips binned? Same question for the TUF regular and OC versions.

I'm not sure whether these cards are being binned, but it seems like available power is the limiting factor for performance, and even then the difference between an FE and a top-end card is pretty narrow. I'm not opposed to spending more money if it's worth it, but it looks like the TUF 3080 is the best value. Right?


----------



## dr.Rafi

Dreamliner said:


> Is there a vbios to increase the power limit of the TUF 3080?
> 
> I'm deciding between getting a TUF, STRIX, or FTW3 card. I don't know which has the higher power limit between the STRIX and the FTW3. Is the only difference between the FTW3 GAMING and ULTRA the vBIOS, or are the chips binned? Same question for the TUF regular and OC versions.
> 
> I'm not sure whether these cards are being binned, but it seems like available power is the limiting factor for performance, and even then the difference between an FE and a top-end card is pretty narrow. I'm not opposed to spending more money if it's worth it, but it looks like the TUF 3080 is the best value. Right?


The only option is a shunt mod; I think no other brand has a higher-power BIOS for 2x8-pin cards. A shunt mod will give you a lift, and a shunt mod plus a high-power 3x8-pin BIOS will give you a crazy lift, around 600 to 700 watts for the card alone, but then the performance depends on your GPU's quality. I tried near-shorting the shunt resistors on my 2x8-pin Ventus: it started pulling 800 watts for the graphics card alone on the Strix BIOS and tripped my AX1600i power supply in benchmarks. The total didn't exceed what the power supply can handle, but the rails were kicking in over-current protection 😱. Went back to reasonable shunts 🤯
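The over-current trips described above follow directly from how the card senses power: the controller infers current from the voltage drop across tiny shunt resistors, so a lower-value (or near-shorted) shunt makes it under-read and stop throttling. A rough sketch of that arithmetic; the resistor values below are illustrative assumptions, not the actual parts on any of these cards:

```python
# Sketch of why a shunt mod raises the effective power limit.
# The controller computes power from V_drop / R_shunt, so if the real
# resistance is lower than the value the firmware assumes, it under-reads.
def reported_power(actual_watts, r_stock_mohm, r_new_mohm):
    """Power the controller *thinks* the card draws after a shunt swap.

    r_stock_mohm: shunt value the firmware assumes (hypothetical).
    r_new_mohm:   resistance actually installed (hypothetical).
    """
    return actual_watts * (r_new_mohm / r_stock_mohm)

# Halving the shunt halves the reading: a real 800 W draw reports as 400 W,
# which is how a near-shorted shunt can trip a PSU's over-current protection
# long before the BIOS power limit ever kicks in.
print(reported_power(800, 5.0, 2.5))  # 400.0
```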


----------



## dr.Rafi

PhoenixMDA said:


> @dr.Rafi
> On Gigabyte, as far as I know, there is a BIOS issue: you can't fix the RTL/IOL or set an offset, so you can't get very good latency.
> But your score is OK.
> It's very important to set all the subtimings like I have shown in my screenshot; if you have trouble booting at high frequency, in most cases it helps to search for the right ODTs.
> The second thing: the best performance on a daisy-chain board is 2x16GB on Z490, if your board can run it at high speed like the MSI Unify, Apex etc. (4400-4600 MHz).
> With 4-way (2x16GB DR, 4x8GB SR) you get, with 200-300 MHz less, the same performance in most cases as with 2-way interleaving (2x8GB SR).
> Under active air cooling, 1.5V for B-die is OK. My memory is specially binned and under water, so I only need 1.486V for [email protected].
> The people who run 2x16GB [email protected],5xV have often tested multiple kits and CPUs.
> View attachment 2464090
> 
> 
> TXP and PPD0 exist on Asus boards and MSI Z490; I don't know about Gigabyte. On Asus Z390 you can only use MemTweakIt to change the TXP and PPD values.
> On Asus you have many values for getting high settings stable: slopes, ODT, BL. But it's very hard to find the right values; ODT is the easiest.
> Here is a picture I put together explaining what they mean. Memory OC can take a lot of time^^.
> Slopes are "control signals" and BL "bitlines for data". Nice to know, but I don't think you need them.
> IO/SA, VDIMM, subtimings, ODT and RTL/IOL are the important things.
> View attachment 2464091


The funny thing is I'm getting graphics issues like Chrome stuttering and closing at 1.5 volts 4500 CL17, while 4400 is fully stable.


----------



## DStealth

For the best memory OC, get the X BIOS from HiCookie













And at 4500, also raise VCCIO/SA a little above your 4400-stable voltages.


----------



## dr.Rafi

DStealth said:


> For the best memory OC, get the X BIOS from HiCookie
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And at 4500, also raise VCCIO/SA a little above your 4400-stable voltages.


Thanks. You mean SA = System Agent?
I am using 1.3 volts for both.
Best for 4400:








Any change to RTL or IOL and the system won't boot.


----------



## whipple16

Dreamliner said:


> Is there a vbios to increase the power limit of the TUF 3080?
> 
> I'm deciding between getting a TUF, STRIX, or FTW3 card. I don't know which has the higher power limit between the STRIX and the FTW3. Is the only difference between the FTW3 GAMING and ULTRA the vBIOS, or are the chips binned? Same question for the TUF regular and OC versions.
> 
> I'm not sure whether these cards are being binned, but it seems like available power is the limiting factor for performance, and even then the difference between an FE and a top-end card is pretty narrow. I'm not opposed to spending more money if it's worth it, but it looks like the TUF 3080 is the best value. Right?


Not sure if this is what you're talking about, but Afterburner lets me adjust the slider... GPU-Z shows it at 117% also.


----------



## Dreamliner

whipple16 said:


> Not sure if this is what you're talking about, but Afterburner lets me adjust the slider... GPU-Z shows it at 117% also.


That's where the different BIOS options come in. I wonder whether "117%" means absolutely more power on an OC card vs. a regular version, since it's 117% of that card's base limit. For example, if one card's base is 100 watts, then 117% is 117 watts; if another card's base is 110 watts, then 117% of that is about 129 watts.
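Assuming the slider really is relative to each card's own base limit, the arithmetic from the example above works out like this (the base wattages are the hypothetical ones from the example, not measured values for any real card):

```python
# Convert a relative power-limit slider setting into absolute watts.
def absolute_limit(base_watts, slider_percent):
    """Absolute power limit for a card whose 100% mark is base_watts."""
    return base_watts * slider_percent / 100

# The same 117% maps to different absolute limits on different cards:
print(absolute_limit(100, 117))  # 117.0 W
print(absolute_limit(110, 117))  # 128.7 W, ~12 W more at the same slider
```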

I think the FTW3 is probably the best card this time around, but I'm not completely sure that's true (plus I know a higher-power vBIOS exists for it). I like the look of the Strix best, but like I said before, I don't know if it's worth spending extra on this card. It just seems like there is very little headroom this generation no matter what.

I probably will never shunt mod. I just want to know the price/performance breakdown of all the different cards.

I'm fine paying the $50 extra for a TUF vs. FE for the better/quieter cooler, but then I'm only $40 away from an FTW3 GAMING... which is only $30 away from an FTW3 ULTRA. See the dilemma?

Those FTW cards are pretty ugly...but this is kind of hard to pass up...


----------



## Dreamliner

I also wonder how far off a 3080 Super/Ti or a 4080 is. I think the Samsung yields are perhaps WAY lower than NVIDIA was expecting, and they might be looking to switch to TSMC sooner rather than later. Thoughts?


----------



## PhoenixMDA

dr.Rafi said:


> "you have with 200-300Mhz less the same Performance" - do you mean 200 MHz less on the CPU speed or on the memory speed?


I mean that in gaming benchmarks, [email protected] is mostly as fast as 2x8GB between 4600-4700 MHz CL17-17 with the same subtimings.

RTL/IOL on Gigabyte is, I think, not fixable.
Normally you can set the input value lower or give a lower offset; you can also set it yourself, but it must be correct.
As an example, if I have 70/70/72/72 and 14/14/14/14, I can set 63/63/65/65 and 7/7/7/7; on Asus, -6 on RTL and IOL will be OK in most cases.
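The offset idea above can be sketched as a tiny helper (the function name is mine and purely illustrative; in practice you type these numbers into the BIOS rather than compute them):

```python
# Apply a fixed negative offset to trained RTL/IOL values, keeping the
# relative spacing between channels/ranks intact, as described above.
def apply_offset(trained_values, offset):
    """Return tightened RTL or IOL targets: each value minus `offset`."""
    return [v - offset for v in trained_values]

rtl = [70, 70, 72, 72]   # trained round-trip latencies from the example
iol = [14, 14, 14, 14]   # trained IO latencies from the example

print(apply_offset(rtl, 7))  # [63, 63, 65, 65]
print(apply_offset(iol, 7))  # [7, 7, 7, 7]
```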


----------



## Nizzen

dr.Rafi said:


> It's safe to flash any 3080 to any 3080,
> though I'm NOT responsible for anyone else's screw-ups 💨


It's not safe to be alive, so I guess flashing a bios is a bit safer 

I flashed my 3080 like 20 times, and it still works. So I guess it's safe enough for me. For you, I dunno.


----------



## dr.Rafi

Nizzen said:


> It's not safe to be alive, so I guess flashing a bios is a bit safer
> 
> I flashed my 3080 like 20 times, and it still works. So I guess it's safe enough for me. For you, I dunno.


Maybe more than 300 times; some days 25 to 30 times. I tried the whole BIOS collection on TechPowerUp for the RTX 3080.


----------



## GTANY

Dreamliner said:


> That's where the different BIOS options come in. I wonder whether "117%" means absolutely more power on an OC card vs. a regular version, since it's 117% of that card's base limit. For example, if one card's base is 100 watts, then 117% is 117 watts; if another card's base is 110 watts, then 117% of that is about 129 watts.
> 
> I think the FTW3 is probably the best card this time around, but I'm not completely sure that's true (plus I know a higher-power vBIOS exists for it). I like the look of the Strix best, but like I said before, I don't know if it's worth spending extra on this card. It just seems like there is very little headroom this generation no matter what.
> 
> I probably will never shunt mod. I just want to know the price/performance breakdown of all the different cards.
> 
> I'm fine paying the $50 extra for a TUF vs. FE for the better/quieter cooler, but then I'm only $40 away from an FTW3 GAMING... which is only $30 away from an FTW3 ULTRA. See the dilemma?
> 
> Those FTW cards are pretty ugly...but this is kind of hard to pass up...


The EVGA FTW3 is a bad choice this generation compared to the TUF or the Strix: lower-quality components, a lower power limit than the Strix (400 vs. 450 W), and a higher-power-limit beta BIOS which does not work because of a hardware limit.

If you are willing to shunt mod: TUF.
No shunt mod: Strix.


----------



## acoustic

GTANY said:


> The EVGA FTW3 is a bad choice this generation compared to the TUF or the Strix: lower-quality components, a lower power limit than the Strix (400 vs. 450 W), and a higher-power-limit beta BIOS which does not work because of a hardware limit.
> 
> If you are willing to shunt mod: TUF.
> No shunt mod: Strix.


Lol, what? The 450-watt BIOS works as it's supposed to.


----------



## VPII

acoustic said:


> Lol what? The 450watt BIOS works as it's supposed to.


Not from what I have seen. They equalised the power limit, which is why you won't get the full 450 watts; I mean with the EVGA FTW3 Ultra.


----------



## acoustic

VPII said:


> Not from what I have seen. They equalised the power limit, which is why you won't get the full 450 watts; I mean with the EVGA FTW3 Ultra.


I have no idea where this originated. I have the FTW3 Ultra and I'm hitting 450 watts constantly in Metro Exodus. I've seen it go over 450 watts in transient spikes.


----------



## hemon

acoustic said:


> I have no idea where this originated. I have the FTW3 Ultra and I'm hitting 450 watts constantly in Metro Exodus. I've seen it go over 450 watts in transient spikes.


I wonder why I only reach about 430 W with the Strix and never 450 W! Should I try flashing your card's BIOS? How can I find it, and how should I flash it?


----------



## SoldierRBT

I have the 3080 FTW3 Ultra with the 450 W BIOS and have no issues hitting 450 W in Quake II RTX, Port Royal, and Time Spy. I believe the problem is the 3090 FTW3 having issues hitting 500 W with the beta BIOS.


----------



## Daemon_xd

Got myself a 3080 Palit GameRock OC for $800 before taxes here in Russia.








GPU-Z shows that the limit for this card is 440 W; pretty nice that I don't need to flash the BIOS.


----------



## Daemon_xd

Port Royal out of the box, just with the power limit raised to max 


https://www.3dmark.com/3dm/52419705?


Is there any guide on curve OC in MSI Afterburner?


----------



## SoldierRBT

New personal record on air


https://www.3dmark.com/3dm/52423363?


Graphics Score: 20 447


----------



## ssgwright

SoldierRBT said:


> New personal record on air
> 
> 
> https://www.3dmark.com/3dm/52423363?
> 
> 
> Graphics Score: 20 447


Nice! The best I can get is 19,400, and that's on water!


----------



## Talon2016

VPII said:


> Not from what I have seen. They equalised the power limit, which is why you won't get the full 450 watts; I mean with the EVGA FTW3 Ultra.


Just so wrong. The 3080 does not have this issue; the 3090 has it with the 500 W limit. The FTW3 3080 hits and exceeds 450 W regularly. For the small price premium, the FTW3 is a great choice for a 3080. You don't need to flash an off-brand vBIOS, nor do you need to physically mod the card and void your warranty, to get top-tier performance out of a 3080.


----------



## dr.Rafi

Getting there 


https://www.3dmark.com/3dm/52427626


----------



## Senaxx

I searched the topic multiple times but cannot find the right answer. I'm lucky enough to have a 3080 Founders Edition. Can this one also be flashed with BIOSes from other cards? Or do I need an unlocked BIOS to push the card further?


----------



## dr.Rafi

My card will undergo extensive surgery today. Hope it makes it; wishing it a quick recovery.


----------



## TK421

Daemon_xd said:


> Got myself a 3080 Palit GameRock OC for $800 before taxes here in Russia.
> View attachment 2464161
> 
> GPU-Z shows that the limit for this card is 440 W; pretty nice that I don't need to flash the BIOS.
> View attachment 2464162


How much is purchase tax normally in Russia?


----------



## GTANY

In the comments of a Frame Chasers YouTube video, I read that a Bykski 3080/3090 Strix waterblock owner says the waterblock interferes with a fan header, so it bends the PCB at the top when it's mounted.

Can you confirm?


----------



## Nizzen

GTANY said:


> In the comments of a Frame Chasers YouTube video, I read that a Bykski 3080/3090 Strix waterblock owner says the waterblock interferes with a fan header, so it bends the PCB at the top when it's mounted.
> 
> Can you confirm?


I can confirm. Waiting for the EK Strix block, and I already ordered the Alphacool.

Bykski works, but it's far from safe to use 😅


----------



## GTANY

Nizzen said:


> I can confirm. Waiting for the EK Strix block, and I already ordered the Alphacool.
> 
> Bykski works, but it's far from safe to use 😅


OK, thank you. Consequently, I will not buy the Bykski waterblock. Is EK the only other manufacturer that makes 3080/3090 Strix waterblocks?


----------



## Vapochilled

GTANY said:


> In the comments of a Frame Chasers YouTube video, I read that a Bykski 3080/3090 Strix waterblock owner says the waterblock interferes with a fan header, so it bends the PCB at the top when it's mounted.
> 
> Can you confirm?


Frame will frame you.
Some of his videos are pure clickbait. He faked the 2x8-pin XC3 working with the 3x8-pin FTW3 BIOS, and he doesn't provide links, so that you give him money, etc.
I would not watch or trust another video from that channel.


----------



## Nizzen

GTANY said:


> OK, thank you. Consequently, I will not buy the Bykski waterblock. Is EK the only other manufacturer that makes 3080/3090 Strix waterblocks?


Alphacool has a 3080/3090 Strix block too.


----------



## Nizzen

Vapochilled said:


> Frame will frame you.
> Some of his videos are pure clickbait. He faked the 2x8-pin XC3 working with the 3x8-pin FTW3 BIOS, and he doesn't provide links, so that you give him money, etc.
> I would not watch or trust another video from that channel.


Did he ever provide false information or lie?

Looks like he knows more than 99.99% of the people here.


----------



## acoustic

Nizzen said:


> Did he ever provide false information or lie?
> 
> Looks like he knows more than 99.99% of the people here.


He definitely claimed that a 2x8-pin XC3 worked with a 3x8-pin FTW3 BIOS, so yeah, he kind of did. He's definitely thrown up a lot of clickbait garbage with the 30-series release...


----------



## Daemon_xd

TK421 said:


> how much is purchase tax normally in russia?


About 30%, and we pay it when we purchase, so the tax just adds to the price.


----------



## cstkl1

This game is awesome.

BTW, after the latest game patch the FPS took a dive.

Day 1, the RTX effects were OK but DLSS Performance was jaggy, just like in Legion.
Then the NVIDIA driver released,
and everything was good, an insane jump.
Then the game got a patch,
and the FPS took a nosedive...

It's also a superb RAM and CPU-cache stability tester. For 48 I had to recalculate all the V/F points for minimum vcore at LL3 (was using LL6 before this).


----------



## TK421

Nizzen said:


> Did he ever provide false information or lie?
> 
> Looks like he knows more than 99.99% of the people here.


The 3x8-pin BIOS on the XC3 seems to be a PCIe shunt problem.


----------



## pompss

Getting the Strix 3080 tomorrow. Any modded BIOS to push the power limit?


----------



## acoustic

The STRIX 3080 has the highest power limit on the market, the same as the FTW3 Ultra at 450 watts.

The only option beyond that right now is a shunt mod.


----------



## cstkl1

Found something weird with the Strix; it should be worse for those on a lower-limit BIOS.

With Ghostrunner at 4K DLSS Quality, without recording, it hits 450 W spikes and a constant 420 W [email protected].
But when I record via OBS NVENC, the power draw is capped at only 370 W, which is the 100% mark; I cannot override it.

Odd...


----------



## dr.Rafi

The Ventus survived the surgery, recovered fast, and is now doing better.


https://www.3dmark.com/spy/14962146


----------



## dr.Rafi

GTANY said:


> In the comments of a Frame Chasers YouTube video, I read that a Bykski 3080/3090 Strix waterblock owner says the waterblock interferes with a fan header, so it bends the PCB at the top when it's mounted.
> 
> Can you confirm?


Even if that's true (it happened to me with a 2080 Ti), just pull off the fan header's plastic shroud and keep it in the box in case you sell the card in the future or need warranty service. Then put some Kapton tape around the fan header pins and bend them just far enough that they stop interfering with the block; be very careful not to break them, and bend them slowly.


----------



## _Killswitch_

Ugly pictures, but I think I officially join this club. I still have to get the single custom mini 12-pin cable because that adapter is ugly =S, plus clean up my other cables too.


----------



## VPII

acoustic said:


> He definitely claimed that a 2x8-pin XC3 worked with a 3x8-pin FTW3 BIOS, so yeah, he kind of did. He's definitely thrown up a lot of clickbait garbage with the 30-series release...





TK421 said:


> The 3x8-pin BIOS on the XC3 seems to be a PCIe shunt problem.


You see, if you actually watch the video all the way through, you'll understand why the BIOS flash worked for him: yes, he did also shunt mod the card.


----------



## PhoenixMDA

dr.Rafi said:


> The Ventus survived the surgery, recovered fast, and is now doing better.
> 
> 
> https://www.3dmark.com/spy/14962146
> 
> 
> View attachment 2464262


Nice work! My very old Ersa soldering station broke on its last job^^.
It was terrible soldering the shunt with a 1 mm tip; the station can't hold the temperature at the tip. Now I have bought an Ersa Nano station, where I can set an offset for the tip.
So next time it will be easy. Which shunts did you solder on? Your score doesn't seem to be with the power fully opened up.


----------



## dr.Rafi

cstkl1 said:


> This game is awesome.
> 
> BTW, after the latest game patch the FPS took a dive.
> 
> Day 1, the RTX effects were OK but DLSS Performance was jaggy, just like in Legion.
> Then the NVIDIA driver released,
> and everything was good, an insane jump.
> Then the game got a patch,
> and the FPS took a nosedive...
> 
> It's also a superb RAM and CPU-cache stability tester. For 48 I had to recalculate all the V/F points for minimum vcore at LL3 (was using LL6 before this).


For me the frame rate is stuck at 60 fps even after disabling V-Sync, including in the NVIDIA Control Panel, and the game is running in DirectX 11 mode. Ray tracing isn't working and asks me to run the game in DX12 mode; I'm not sure how to do that, as there is no menu option for it and no launch .exe for DX12.


----------



## dr.Rafi

PhoenixMDA said:


> Nice work! My very old Ersa soldering station broke on its last job^^.
> It was terrible soldering the shunt with a 1 mm tip; the station can't hold the temperature at the tip. Now I have bought an Ersa Nano station, where I can set an offset for the tip.
> So next time it will be easy. Which shunts did you solder on? Your score doesn't seem to be with the power fully opened up.


It was a stressful operation. The MLCCs are 1.25 x 2 mm; it is easier to do with hot air, but they sit directly on the back of the GPU. I put a K-type thermocouple on the GPU while applying hot air to the back, and the GPU temp jumped to 150°C. Each group of 4 MLCCs has a big, short PCB connection underneath that is connected directly to the GPU, which makes it very hard to soften the solder at that point: the GPU sucks away all the heat. I was worried about melting the GPU balls or even killing the GPU chip; I think in the factory they reflow the components first and the GPU chip last. In the end I had to do it with the soldering station at 500°C / 40 W and a lot of flux. Once done, I washed the whole card with electronic circuit cleaner (the PCB has many tiny holes in the back of the GPU area; I think they are for letting heat escape, though I'm not sure, so I didn't want to block those holes with flux).

Yes, it's shunted already, but the maximum draw for the graphics card alone is 390 to 430 W during Time Spy and Port Royal.


----------






## cstkl1

dr.Rafi said:


> For me the frame rate is stuck at 60 fps even after disabling V-Sync, including in the NVIDIA Control Panel, and the game is running in DirectX 11 mode. Ray tracing isn't working and asks me to run the game in DX12 mode; I'm not sure how to do that, as there is no menu option for it and no launch .exe for DX12.


Whichever version you're using,
just use -dx12 in the launch options.


----------



## PhoenixMDA

dr.Rafi said:


> It was a stressful operation. The MLCCs are 1.25 x 2 mm; it is easier to do with hot air, but they sit directly on the back of the GPU. I put a K-type thermocouple on the GPU while applying hot air to the back, and the GPU temp jumped to 150°C. Each group of 4 MLCCs has a big, short PCB connection underneath that is connected directly to the GPU, which makes it very hard to soften the solder at that point: the GPU sucks away all the heat. I was worried about melting the GPU balls or even killing the GPU chip; I think in the factory they reflow the components first and the GPU chip last. In the end I had to do it with the soldering station at 500°C / 40 W and a lot of flux. Once done, I washed the whole card with electronic circuit cleaner (the PCB has many tiny holes in the back of the GPU area; I think they are for letting heat escape, though I'm not sure, so I didn't want to block those holes with flux).
> 
> Yes, it's shunted already, but the maximum draw for the graphics card alone is 390 to 430 W during Time Spy and Port Royal.


The GPU chip itself you won't destroy with that temperature, I think; the chip is silicon-based. But the soldered GPU balls...
The shunt mod with the 25 mΩ resistors I did with my old Ersa MS6000 (30 years old)^^. It was terrible getting an even temperature on the tip.
Today I got my new Ersa Nano station. Very nice: within 10 s you can begin soldering, the station holds the temp at the tip, and if it's too low you can set a tip offset from 1-3.
It's like an overshoot in the tip regulation, so you always have enough heat at the tip and can solder quickly and cleanly.
For little things like your MLCCs, I bought a temperature-optimized 0.3 mm tip.

Ersa "Industrie Standard" stations are great.


----------



## Larkonian

My Strix is a bit of a dud. Only barely does 20k graphics score:

https://www.3dmark.com/spy/14996971

Games run fine though and I will probably upgrade in a year or so anyway.


----------



## dr.Rafi

PhoenixMDA said:


> The GPU chip itself you won't destroy with that temperature, I think; the chip is silicon-based. But the soldered GPU balls...
> The shunt mod with the 25 mΩ resistors I did with my old Ersa MS6000 (30 years old)^^. It was terrible getting an even temperature on the tip.
> Today I got my new Ersa Nano station. Very nice: within 10 s you can begin soldering, the station holds the temp at the tip, and if it's too low you can set a tip offset from 1-3.
> It's like an overshoot in the tip regulation, so you always have enough heat at the tip and can solder quickly and cleanly.
> For little things like your MLCCs, I bought a temperature-optimized 0.3 mm tip.
> 
> Ersa "Industrie Standard" stations are great.


Can you please give the model number of the new Ersa you have?
Thanks.


----------



## daveleebond

Has anyone tried out the new 3DMark DXR test with a 3080, and with what results?


----------



## dr.Rafi

I need to figure something out: my average and max core clocks in Time Spy are higher than many others', and my CPU score is higher too, but my total graphics score is lower. Not sure what is wrong.


----------



## Mad Pistol

daveleebond said:


> Has anyone tried out the new 3DMark DXR test with a 3080, and with what results?


Yep. A 3080 FE gets roughly 50 FPS.


----------



## PhoenixMDA

daveleebond said:


> Has anyone tried out the new 3DMark DXR test with a 3080, and with what results?


The new test isn't a heavy load.


----------



## PhoenixMDA

dr.Rafi said:


> I need to figure something out: my average and max core clocks in Time Spy are higher than many others', and my CPU score is higher too, but my total graphics score is lower. Not sure what is wrong.


That gives a higher score, like I said.









And here is the station I bought. Don't be fooled by the size: 80 W continuous, 150 W max, 9 s to heat up, you can set an offset on the tip, and it's ESD-safe.
It belongs to the industry-standard series and is really enough for a "normal" use case like shunt mods, MLCCs and so on.
0IC1200A-LD
Here is also a tip for really small things, a thermally optimized one:
0102PDLF03

If you solder very often and want something really professional, I would suggest this one, but it's much more expensive at 500,-:
the Ersa industry premium with 2 channels. Both have the i-Tool soldering iron, but this one is 150 W continuous;
that much wattage you normally don't need, but this station offers soldering tweezers, perfect for things like your MLCCs, if you want to do it really easily.
With the soldering tweezers it costs 799,-:
0IC2200VC

The Nano station, I think, is enough, and the price for that is really good.


----------



## DirtyScrubz

Larkonian said:


> My Strix is a bit of a dud. Only barely does 20k graphics score:
> 
> https://www.3dmark.com/spy/14996971
> 
> Games run fine though and I will probably upgrade in a year or so anyway.


How is 2.1ghz a dud?


----------



## ssgwright

DirtyScrubz said:


> How is 2.1ghz a dud?


I know.. lol


----------



## dr.Rafi

PhoenixMDA said:


> That gives a higher score, like I said.
> View attachment 2464322
> 
> 
> And here is the station I bought. Don't be fooled by the size: 80 W continuous, 150 W max, 9 s to heat up, you can set an offset on the tip, and it's ESD-safe.
> It belongs to the industry-standard series and is really enough for a "normal" use case like shunt mods, MLCCs and so on.
> 0IC1200A-LD
> Here is also a tip for really small things, a thermally optimized one:
> 0102PDLF03
> 
> If you solder very often and want something really professional, I would suggest this one, but it's much more expensive at 500,-:
> the Ersa industry premium with 2 channels. Both have the i-Tool soldering iron, but this one is 150 W continuous;
> that much wattage you normally don't need, but this station offers soldering tweezers, perfect for things like your MLCCs, if you want to do it really easily.
> With the soldering tweezers it costs 799,-:
> 0IC2200VC
> 
> The Nano station, I think, is enough, and the price for that is really good.







I did those settings fixes already, but I've noticed Time Spy isn't consistent: sometimes a higher average clock but a lower graphics score, sometimes a lower clock but higher scores. I'm trying to pass 20,000 graphics. And thank you for the soldering info.


----------



## PhoenixMDA

If your memory speed is too high, the score also goes down. For me it's OK up to +1400; at +1450 I get a lower score.


----------



## Greg1969

Hi guys,
I need advice. I have the option of getting an Aorus Xtreme 3080 or a ROG Strix OC 3080. Which would be the better choice?


----------



## Nizzen

Best at what?
The Strix for the highest overclock, the Aorus Xtreme for the best cooling (if the case gives the card enough fresh air).

My choice is the Strix, 10/10 times.


----------



## Greg1969

Better choice in terms of build quality, cooling, OC, support, performance, looks, etc. The price difference is negligible.


----------



## dr.Rafi

PhoenixMDA said:


> If your memory speed is too high, the score also goes down. For me it's OK up to +1400; at +1450 I get a lower score.


I tried all the memory speeds too. I think something is hidden in 3DMark, or the motherboard-to-card interface (the data transfer between GPU and motherboard) is not as good as for others who have better scores: the GPU is clocking high but not receiving data as fast. It is running in PCIe x16 mode but not efficiently enough. The last possibility is some motherboard BIOS setting that needs changing, but I don't know which.
Check this test of mine:


http://www.3dmark.com/spy/15004119


(Notice the max and average core and memory clocks; I already tried lowering the memory, with the same results.)
Compare it to this one, https://www.3dmark.com/spy/14996971,
from Larkonian,
even though I did DDU and installed the same NVIDIA driver version,
even tried 4x8 memory,
and a stripped Windows.
This is the best I can do for now:


https://www.3dmark.com/spy/15010958


----------



## PhoenixMDA

With that memory clock I also can't reach such a score^^


----------



## PhoenixMDA

PhoenixMDA said:


> With that memory clock I also can't reach such a score^^


Here, this was my best score:
https://www.3dmark.com/spy/14952738
And this is with +180 GPU and +1000 mem:
https://www.3dmark.com/spy/15025133


----------



## rjrusek

dr.Rafi said:


> The Ventus passed the surgery and recovered fast and now doing better.
> 
> 
> https://www.3dmark.com/spy/14962146
> 
> 
> View attachment 2464262


Would love to see pics..


----------



## dr.Rafi

PhoenixMDA said:


> Here this was my best Score
> https://www.3dmark.com/spy/14952738
> And this is with +180GPU and +1000mem
> https://www.3dmark.com/spy/15025133


I found the problem. I'm using a CPU water block on the GPU and heatsinks on the memory and power stages; the memory was overheating and throttling because the small heatsinks weren't covering them properly. I put a Tornado server 120mm 5000 RPM fan on the card. I'm not getting higher scores now, but they're consistent between runs. My sweet spot on memory is +800 in Afterburner; it doesn't scale above that. When I use +1375 like yours, my score drops a lot. And maybe it's not only a cooling problem; yours may simply have better memory chips.
I'll try putting the stock cooler back on later today. I've never tested the card with it; the first day I got the card I put the water block on the GPU.


----------



## dr.Rafi

Yes, that was the problem. Back on air with the stock cooler, here's the final score, and I didn't even push the overclock to crash levels. The GDDR6X needs proper cooling; when it overheats it throttles and gives inconsistent results on each test run, but the stock cooler keeps the video memory cool enough not to throttle under testing.


https://www.3dmark.com/spy/15030476


Video memory at +1145 in Afterburner.
Number 1 for similar systems.


----------



## PhoenixMDA

I'm waiting for my water block... it was supposed to arrive three weeks ago.


----------



## dr.Rafi

PhoenixMDA said:


> I'm waiting for my water block... it was supposed to arrive three weeks ago.


I never preorder because I hate waiting. I want to get a proper water block for the Ventus, but the options are either Corsair (which I've never tried, and I don't know how they perform) or Bitspower (which doesn't have a metal cover for the inductors, only the power MOSFETs, and the inductors get really hot under load). Both are preorder. I hope EK makes one for the Ventus.


----------



## dr.Rafi

There is this company, but also no capacitor or inductor cooling:

Watercooler 3080 3090 3x 10g Cover Water Blk N-ms3090ves-x - US $41.90 (www.aliexpress.com)


----------



## dev1ance

MSI 3080 Gaming X Trio flashed with the Strix BIOS:


https://www.3dmark.com/spy/15029626



Can't really push the core more than +130 even with the max power limit.


----------



## dr.Rafi

dev1ance said:


> MSI 3080 Gaming X Trio flashed with Strix BIOs:
> 
> 
> https://www.3dmark.com/spy/15029626
> 
> 
> 
> Can't really push core more than +130 even with max PL


Try maxing the core voltage first. If you're maxing power around 400 to 450 W during the test, then shunting is the only way to Rome, my friend.


----------



## dev1ance

dr.Rafi said:


> Try maxing the core voltage first. If you're maxing power around 400 to 450 W during the test, then shunting is the only way to Rome, my friend.


I can't seem to change the voltage?


----------



## dr.Rafi

dev1ance said:


> I can't seem to change the voltage?


Latest beta of MSI Afterburner.

In settings, tick "unlock voltage control" and "unlock voltage monitoring". Moving the slider won't change the displayed voltage, but it will feed a higher voltage to the core at the same boost clock, and you don't need to increase the core clock after maxing the voltage; it will automatically boost the clock a bit higher. If you get a crash, downclock in -5 increments and try again.
With my MSI Ventus, the best BIOS for me is the EVGA 450 W or the Aorus Master; the Asus Strix gives me less performance. I think it's because Asus's power-stage chips are very different from other AIB cards.


----------



## Rik_IV

Which 3080 would you guys prefer to keep, considering BIOS flashing and a potentially better PCB: the MSI Gaming X Trio or the ASUS TUF OC? I'm really unsure, because I see people successfully flashing their MSI to a more powerful BIOS, but technically the TUF has more power stages and dual BIOS, among other things.

Which one would you guys suggest and why?


----------



## VPII

Rik_IV said:


> What 3080 would you guys prefer to keep considering flashing bios and potentially better PCB. The MSI Gaming X Trio 3080 or the ASUS TUF OC? I am really unsure because I see people successfully flashing their MSI to a more powerful BIOS but technically the TUF has more power stages and dual BIOS among other things.
> 
> Which one would you guys suggest and why?


The MSI Gaming X Trio takes 3x 8-pin power, which is why it can be flashed with, say, the 450 W Strix BIOS. The same can't be said for the TUF or TUF OC, as they only take 2x 8-pin power. The power stages don't really matter; they're pretty over-engineered in most cases... I say most cases, as there are some questionable ones.


----------



## DStealth

O yeaa
View attachment 2464472


----------



## Valdemar

DStealth said:


> O yeaa
> View attachment 2464472
> 
> View attachment 2464490


Congratulations on your purchase! I'm going to get one too, and I'd be hugely grateful, from the bottom of my heart, if you could test your EVGA 3080 in the toughest conditions: deliberately create "difficult conditions" in a closed case without extra airflow, if it's not too much trouble, and run a constant load at maximum consumption (380-420 W; Metro works) for about an hour, at stock "out of the box" settings with no tweaks. What fan speeds and temperatures do you get? I want a rough idea of what to expect so I can choose. This is very important to me right now, so as soon as possible if you can. Thank you in advance, and I hope for your understanding!


----------



## Reinhardovich773

Hi guys! I just wanted to ask whether it'd be worth it to flash a Palit GameRock OC VBIOS onto my Palit 3080 GamingPro OC. The former uses 3x 8-pin power connectors, has a 440 W power limit and employs a custom PCB, whereas the latter only has 2x 8-pin power connectors, has a 350 W power limit and employs a reference PCB. I did see that some people concluded that flashing VBIOSes from higher-end models onto lower-end models doesn't yield much, but if someone here has been able to extract more performance out of a severely power-limited card, I'd really love to hear about that experience. Thanks in advance for any potential reply!


----------



## Alemancio

Reinhardovich773 said:


> Hi guys! I just wanted to ask whether it'd be worth it to flash a Palit GameRock OC VBIOS into my Palit 3080 GamingPro OC. The former uses 3 x 8-pin power connectors, has a 440 W power limit and employs a custom PCB, whereas the latter only has 2 x 8-pin power connectors, has a 350 W power limit and employs a reference PCB. I did see that some people concluded that flashing VBIOSes from higher-end models into lower-end models doesn't yield much but if someone here has been able to extract more performance out of a severely power limited card, i'd really love to hear about such an experience if possible. Thanks in advance for any potential reply!


Have you tried undervolting? Many of us run 2025 MHz at 0.9-0.925 V and rarely hit the power limit.
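A rough way to see why such undervolts dodge the power limit: dynamic power scales roughly with frequency times voltage squared. A minimal sketch, using the 2025 MHz / 0.925 V point above against an assumed (not measured) stock-ish point of 1950 MHz at 1.075 V:

```python
# Sketch: why an undervolt can hold ~2 GHz without hitting the power limit.
# Dynamic power scales roughly with frequency * voltage^2; leakage and
# memory power are ignored, so treat the result as a rough estimate only.
def relative_power(freq_mhz, volts, ref_freq_mhz, ref_volts):
    """Power relative to a reference operating point, P ~ f * V^2."""
    return (freq_mhz / ref_freq_mhz) * (volts / ref_volts) ** 2

# 2025 MHz @ 0.925 V (from the post) vs an assumed 1950 MHz @ 1.075 V stock point
r = relative_power(2025, 0.925, 1950, 1.075)
print(f"undervolted point draws ~{r * 100:.0f}% of the reference power")
```

With these illustrative numbers the undervolted point lands around three quarters of the reference power, which is consistent with "rarely hit the power limit".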


----------



## DStealth

18866, just walking around.


----------



## Battler624

So far I'm only doing undervolting + OC + max fans to get the best results I can, but that's still not enough. Is there a way to unlock or increase the power limit of the 2x 8-pin Ventus 3X?


----------



## dr.Rafi

Battler624 said:


> So far I'm only doing undervolting + OC + max fans to get the maximum results I can but thats still not enough, Is there a way to unlock or increase the powerlimit of the 2*8pin Ventus 3X?


check my journey on this thread and my posts.


----------



## DirtyScrubz

Are there any good water blocks for the Strix 3080 out right now? I checked EK, and it seems their block releases late this month, so it'll probably be a while till shops like PPC have it.


----------



## Battler624

dr.Rafi said:


> check my journey on this thread and my posts.


I've been reading your journey for the past 10 minutes, and so far it seems that on the Ventus at least you're using the Aorus Master BIOS? How are you liking it so far?


----------



## Mucho

DirtyScrubz said:


> Is there any good waterblocks for Strix 3080 out right now? I checked EK and it seems their block releases late this month so it’ll probably be awhile till shops like PPC have it.


Alphacool Eisblock Wasserkühler für ASUS ROG Strix RTX 3080/3090! | Nvidia Fullsize | GPU Water Cooler | Shop | Alphacool - the cooling company

Bykski GPU Wasser Kühl Block Für ASUS RTX3080 3090 STRIX, Grafikkarte Flüssigkeit Kühler System, RTX 3080 3090, N AS3090STRIX X|Lüfter & Kühlung| - AliExpress


----------



## Reinhardovich773

Alemancio said:


> Have you tried undervolting? Many of us run 2,025mhz at 0.9~0.925V and rarely hit powerlimit.


Hi there! I did try setting a custom voltage curve in Afterburner beta 3 (the latest beta), but sadly my card would not follow anything I set. The GPU clock speeds just vary by game and by how much the card is stressed; in general I get anywhere from 1860 MHz to 2115 MHz, with clocks in the 1935-1980 MHz region most of the time. I just cannot control the voltage like I used to with my 2070, as this Palit 3080 runs into power limits all the time, even below 1 V of VCore. And BTW, this is with a +140 MHz core overclock and a +850 MHz memory overclock.


----------



## Battler624

dr.Rafi said:


> check my journey on this thread and my posts.





Battler624 said:


> Been readin your journey for the past 10 minutes and so far it seems on the ventus atleast you are using the aorus master bios? So far how are you liking it?


So I tried the Aorus Master BIOS. Nope, it isn't working on my 3080: black screen. Good thing I had a backup GPU, so I was able to recover.


----------



## Falkentyne

Battler624 said:


> So I tried the aorus master bios, nope it ain't working on my 3080. Black screen, good thing I had a backup GPU so I was able to recover.


A black screen means your DisplayPort output got disabled. All you had to do was switch to another DisplayPort (or at least try HDMI first).


----------



## Battler624

Falkentyne said:


> Black screen means your displayport port got disabled. All you had to do was switch to another displayport (or at least try HDMI first).


Nah, I'm pretty sure that BIOS is full of issues, because GPU-Z shows the BIOS version as unknown (unless that's normal?). Anyway, I tried the Asus TUF one and it showed up normally in GPU-Z, so I do think it's a problem with the Gigabyte BIOS.

The BIOS in question is the Gigabyte RTX 3080 VBIOS.


Also, dr.Rafi, how are you getting your numbers? I got the TUF BIOS working, but the card is still not drawing more power (capped at 320 W, even though the BIOS and Afterburner show a 117% power limit).


----------



## Falkentyne

Battler624 said:


> Nah i'm pretty sure that bios was full of issues, because running gpu-z shows the bios version as unknown *unless this is normal?*. anyway I tried the asus TUF one and it showed up normally in gpu-z so I do think its a problem from the gigabyte bios.
> 
> The bios in question is Gigabyte RTX 3080 VBIOS.
> 
> 
> Also Dr.rafi how are you getting your numbers? I got the tuf bios working but it is still not drawing more power (capped at 320W even tho bios shows and afterburner shows 117% powerlimit)


Oh you're right, then. Sorry. Surprised the flash failed.


----------



## Battler624

Falkentyne said:


> Oh you're right, then. Sorry. Surprised the flash failed.


Well, I have news: the BIOS I linked above is definitely broken, because the older build works fine. I need to contact TechPowerUp somehow and inform them about this.

Anyway, it's weird: even though GPU-Z says the default power limit is 370 W, the card seems capped at around 320 W.


----------



## Killmassacre

I got a 3080 Trio a while ago and just got around to OC'ing it, but I was only able to get a graphics score of 19054 in Time Spy and 12256 in Port Royal. Does this seem average or below average for a Trio? This is with +140 core clock, +500 memory, stock voltage. I kinda wish I could have gotten a FTW3.



https://www.3dmark.com/spy/15051049




https://www.3dmark.com/pr/473316


----------



## Battler624

Killmassacre said:


> I got a 3080 trio a while ago and I just got around to OC'ing it, but I was only able to get graphics score of 19054 in timespy and 12256 in Port Royal. Does this seem average or below average for a trio? This is with +140 core clock, +500 memory, stock voltage. I kinda wished I could have gotten a FTW3.
> 
> 
> 
> https://www.3dmark.com/spy/15051049
> 
> 
> 
> 
> https://www.3dmark.com/pr/473316


19K+ is great


----------



## dev1ance

Killmassacre said:


> I kinda wished I could have gotten a FTW3.


Bruh, those stock scores are great. You'd have to flash a BIOS and OC to get maybe +500 more in Time Spy. Not really worth it for everyday use.


----------



## dr.Rafi

Most reviewers show the 5900X beating the 10900K, but no impressive CPU scores yet!


----------



## dr.Rafi

Battler624 said:


> Been readin your journey for the past 10 minutes and so far it seems on the ventus atleast you are using the aorus master bios? So far how are you liking it?


Great, but each BIOS behaves differently on your card, so be patient with each new BIOS before you judge that it isn't doing well. And your card's maximum core clock with each BIOS means nothing; you have to compare FPS in benchmarks rather than the core clock. I use FurMark at 4K with 8x anti-aliasing as a quick test to see if I can get more FPS.


----------



## dr.Rafi

Battler624 said:


> So I tried the aorus master bios, nope it ain't working on my 3080. Black screen, good thing I had a backup GPU so I was able to recover.


A black screen doesn't always mean the BIOS isn't working; sometimes some of the display outputs go dead with a different BIOS. That never happened to me with the Ventus, but if it does, try plugging the display into a different output on the card. Your best bet is the 3-connector high-wattage BIOSes like the FTW3 Ultra 450 W, Asus Strix, or Aorus Xtreme.
The MSI Trio is a great card; I passed 20000 with the Ventus. But I saw your score page and your system memory is running at 2666, which is extremely low. When you compare your card's score to others, check everything: GPU temperature, NVIDIA driver version, Windows version, graphics memory clocks, CPU scores. You should also have a freshly installed Windows; an install with dozens of programs installed and uninstalled affects performance. Make sure performance mode is enabled in Windows and close all software running in the background.
If you don't want all that headache, compare Port Royal scores instead; Port Royal is about 98% GPU-dependent and will give you a much clearer idea of how your card is doing.
Good luck


----------



## PhoenixMDA

Battler624 said:


> Nah i'm pretty sure that bios was full of issues, because running gpu-z shows the bios version as unknown *unless this is normal?*. anyway I tried the asus TUF one and it showed up normally in gpu-z so I do think its a problem from the gigabyte bios.
> 
> The bios in question is Gigabyte RTX 3080 VBIOS.
> 
> 
> Also Dr.rafi how are you getting your numbers? I got the tuf bios working but it is still not drawing more power (capped at 320W even tho bios shows and afterburner shows 117% powerlimit)


Yes, I think so. Look at the DP and HDMI ports of the Gigabyte Master: if you want to try, take the BIOS from the Gigabyte Vision, since two of its ports match yours, while none of the Master's do. It was the same for me with the Master BIOS; I haven't tested the Vision BIOS.


----------



## ssgwright

I'm using liquid metal for a shunt mod, but I'm worried... last time I did this, the resistors fell off my 2080 after a while lol. I didn't know liquid metal breaks down solder.


----------



## Stash

I was unfortunate enough to only have access to a Gaming X Trio when the cards dropped, and I've been reading up on the shocking design decisions: 3x 8-pins with only 12 power phases and a grim 350 W TDP...

Is there any hope or should I just flip it and buy a good card?


----------



## dr.Rafi

Battler624 said:


> Nah i'm pretty sure that bios was full of issues, because running gpu-z shows the bios version as unknown *unless this is normal?*. anyway I tried the asus TUF one and it showed up normally in gpu-z so I do think its a problem from the gigabyte bios.
> 
> The bios in question is Gigabyte RTX 3080 VBIOS.
> 
> 
> Also Dr.rafi how are you getting your numbers? I got the tuf bios working but it is still not drawing more power (capped at 320W even tho bios shows and afterburner shows 117% powerlimit)


Shunt mod


----------



## dr.Rafi

Battler624 said:


> Well I have news, so yea the bios i linked above is definitely bricked because the older build one works fine, need to contact techpowerup somehow and inform them about this.
> 
> Anyway, Its weird tho, even tho I have GPU-Z saying default power is 370W it seems that the card is capped at around 320W


Welcome to NVIDIA stealing overclockers' joy, one step further with every release.


----------



## Rik_IV

So I have been testing my MSI Gaming X Trio a bit and the results seem a bit odd; maybe you guys could shed some light.
Very first test, stock settings: 17802 GPU score
Second test, power limit 102%: 18332 GPU score
Third test, PL 102%, +50 core, +500 mem: 18664 GPU score
Fourth test, PL 102%, +80 core, +500 mem: 18676 GPU score
Fifth test, PL 102%, +140 core, +500 mem: 18870 GPU score

The first benchmark seems really low for starters. The second seems a bit more normal, but the later gains are so insignificant. Is this just a bad chip, or is it insufficient testing with only one benchmark (Time Spy)?
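The diminishing returns in runs like these are easier to see as percentage gains per step. A small sketch using the scores from this post:

```python
# Percentage gain of each Time Spy run over the previous one, using the
# five GPU scores reported above.
runs = [
    ("stock",           17802),
    ("PL 102%",         18332),
    ("+50 core, +500",  18664),
    ("+80 core, +500",  18676),
    ("+140 core, +500", 18870),
]

for (prev_name, prev), (name, score) in zip(runs, runs[1:]):
    gain = (score - prev) / prev * 100
    print(f"{name:>16}: {score} ({gain:+.2f}% vs {prev_name})")
```

Each overclocking step past the power-limit bump contributes roughly a percent or less, which is the pattern you'd expect from a power-limited card.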


----------



## DStealth

Time Spy becomes CPU-bottlenecked.
Test Port Royal or Time Spy Extreme to reduce the CPU/RAM influence.

Edit: Just looked at your results again... probably power-limited. On the stock BIOS (340/350 W), ~18.7k is very good.
There are users flashing the Strix OC or EVGA XOC BIOS on the Gaming X Trio to extend the limit to 450 W.


----------



## Rik_IV

DStealth said:


> TimeSpy becomes CPU bottlenecked
> Test PortRoyal or TimeSpy extreme to reduce the CPU/RAM influence .



Interesting... I'm running a 10700K at 5.2 GHz all-core / 3200 MHz CL16 (HT disabled). I'll try those as well.


----------



## DStealth

Yes, power-limited for sure. With +140 vs +50 you actually just get a few more high spikes, not a real 90 MHz more; that's why the scores are so similar.
You can see this in your 3DMark results by looking at the "Average clock frequency" for those runs.


----------



## Rik_IV

DStealth said:


> Yes for sure Powerlimited with +50 and +140 you actually has a few more spikes higher, but not real 90mhz that's why the score is very similar
> You can observe this in you 3dmark scores looking the "Average clock frequency" for these runs.



That makes sense! I knew about that average metric. I literally just bought 3DMark moments ago; I was running the demo before. Honestly, I'm too scared of bricking my GPU by flashing the Strix BIOS.
PL 102%, +140 core, +500 mem: 12199 Port Royal
PL 102%, +160 core, +500 mem: 12257 Port Royal
PL 102%, +160 core, +500 mem: 19002 Time Spy
I guess that's decent, right?

I will see how far upping the core clock will increase my score, then do the same on a TUF OC later next week.

Update: +160 seems about the max I can go on the Trio.


----------



## DStealth

Not bad, but the best 3x 8-pin cards are close to 13k in Port Royal and exceed 20k GPU in Time Spy. You need either a shunt mod or a BIOS flash to improve the scores.


----------



## cstkl1

Project CARS 3 and now this. All ugly but asking a lot. Does it take an NFS Hot Pursuit 2 remaster to break 2020's run of ugly racing games??


----------



## Reinhardovich773

cstkl1 said:


> project cars 3 and then this. all fugly but asking alot. does it takd nfs hot pursuit 2 remastered to break 2020 fugly race games??


After the marvel that is Forza Horizon 4 (in terms of both looks and optimisation), every racing game now looks terrible to me haha!


----------



## cstkl1

Reinhardovich773 said:


> After the marvel that is Forza Horizon 4 (in terms of both looks and optimisation), every racing game now looks terrible to me haha!


True... but what's actually ridiculous is that a remastered 2002 NFS Hot Pursuit 2 (based on Heat) is going to look way better and perform better.


----------



## pompss

Anyone able to reach 2150 MHz? Strix 3080, I'm at 2130 MHz, 105% power limit. How do I unlock a higher power limit?


----------



## Alemancio

What are some good milestones to determine whether my card is a good bin or a dud? (EVGA FTW3 with the 450 W BIOS)

So far I've only been able to reach, 100% stable (3DMark stress test pass):

Max OC: 2085 MHz @ 118%
Daily OC: 2025 MHz @ 0.950 V, 110%

I think it's pretty average, no?


----------



## Wries

Any 3080 FE owners here? I want to check up on a weird issue that might just be me being unlucky.

The back/top fan is weird. It is, if I squint real hard, actually a bit warped. It also makes a squeaky noise that I thought for a while was some kind of coil whine, until I realised where it came from. The sound wasn't very prominent, and honestly I don't care much about small noises, but realising it was an actual fan squeaking made me actively worried.

Now, Digital River (nvidia.com's seller) doesn't have support staff that fully understand English. They read phrases and words and copy/paste responses based on that, so I never got clarification from them on what they'd do if I RMA'd the card. Weird times, but I use it for work and don't want to simply get my money back and have to search for another non-existent compact 3080 at a higher price.

So I did what any stupid person would do: unscrewed the fan by the four screws accessible through the fan blades, aimed carefully, and sprayed some mineral oil into it. Voila, the squeak seems to be gone. I don't recommend doing this, for obvious reasons, but hey.

Now I'm still wondering: how well do these fans actually withstand heat? I've had mine since late September, which is longer than most. I wonder if it was just a fluke, or if more of these will start squeaking soon enough.

Can any owner confirm or deny for me that, with the default fan curve, Fan #2 runs a couple of percent lower in speed than Fan #1?
Found confirmation that this is normal: Nvidia RTX 3080 Founders Edition - LanOC Reviews.


----------



## Battler624

dr.Rafi said:


> Shunt mod


I've never modded hardware; maybe if I can get outside help I'll do it.

But a quick question: did you both shunt mod and change the BIOS?


----------



## dr.Rafi

Battler624 said:


> never modded hardware, maybe if i can get outside help i'll do it.
> 
> But quick question, did you both do a shunt mod and change the bios?


If you shunt mod, you don't even need to change the BIOS, because the difference will be minimal. If you're worried about warranty, you can put solder paste on top of the existing shunt's terminals, place the new shunt resistor on top, secure it with tape, and then use a piece of the foam that comes in PC parts boxes, cut so it keeps a bit of pressure on the shunt via the heatsink (the tape alone might come loose after a while). When you want to remove it, simply clean off the paste with a cotton pad. Personally, I did real soldering. I live in Australia, and my card should hold up through two years of shunted use; I'm very careful not to torture the card with high temperatures. 99% of people here in Australia don't know about shunting and would never expect someone to take the risk of doing such a thing, and my desoldering is clean and nearly factory-looking, so I can later sell on eBay, and I've never had an issue with any customer.


----------



## dr.Rafi

Battler624 said:


> never modded hardware, maybe if i can get outside help i'll do it.
> 
> But quick question, did you both do a shunt mod and change the bios?


Oops, wait, you've got the Trio, right? That card is built to pull 520 W max (150 W x 3 8-pins + 70 W from the PCIe slot). You don't need shunting; your card can simply be flashed. Try the EVGA 450 W BIOS; it should work.
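The connector arithmetic in this post is easy to sanity-check. A minimal sketch using the per-connector figures quoted above (150 W per 8-pin, ~70 W from the slot; the PCIe spec nominally allows up to 75 W from the slot):

```python
# Back-of-envelope board power budget by connector count, using the figures
# from the post above (150 W per 8-pin, ~70 W drawn from the PCIe slot).
PCIE_SLOT_W = 70   # the post uses 70 W; the PCIe CEM spec allows up to 75 W
EIGHT_PIN_W = 150

def max_budget(num_8pin):
    """Nominal max board power for a card with the given 8-pin count."""
    return num_8pin * EIGHT_PIN_W + PCIE_SLOT_W

print(max_budget(3))  # 3x 8-pin (e.g. Gaming X Trio) -> 520 W
print(max_budget(2))  # 2x 8-pin (e.g. Ventus)        -> 370 W
```

This is why a 450 W BIOS is comfortable on a 3x 8-pin card but pushes a 2x 8-pin card past its nominal connector budget.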


----------



## Colonel_Klinck

Hey all, finally got my ASUS TUF OC installed. The current BIOS is below. Is that the highest-power-limit one out there, or is there something higher? I can't actually see this BIOS on TechPowerUp.


----------



## Rik_IV

Colonel_Klinck said:


> Hey all, finally got my ASUS TUF OC installed. Current bios is below. Is that the highest power limit one out there, is there higher? I can't actually see this bios on Techpowerup.


What do you mean? It's right there: Asus RTX 3080 VBIOS.

As for a higher-power-limit one for the TUF OC: to my knowledge, since it's 2x 8-pin, there is no BIOS with a higher power limit for the TUF OC.


----------



## Colonel_Klinck

Rik_IV said:


> What do you mean its right there. Asus RTX 3080 VBIOS
> 
> As for a higher power limit one for the TUF OC. To my knowledge... since its 2x8pin there is no bios with a higher power limit for the TUF OC.


Doh, I couldn't see it even though it was indeed there. I've missed the last 25 pages or so and had hoped a higher-power-limit BIOS might have appeared for 2x 8-pin cards. Oh well, we'll have to wait I guess.


----------



## dr.Rafi

Colonel_Klinck said:


> Doh couldn't see it even though it was indeed there. I've missed the last 25 pages or so and had hoped a higher power limit bios might have appeared for 2 pin cards. Oh well we'll have to wait I guess.


You don't need to go through all the pages. Simply use the search on the BIOS page, filter for RTX 3080 only and brand Asus only, and you'll have only one page to look at.


----------



## _Killswitch_

Not much into benchmarking, but I decided to play with my 3080 today. This is the best "middle ground" core/memory overclock I came up with on my card. Stock 3080 FE (no BIOS flash), and yes, I know my 8700K is probably limiting me; hoping to upgrade a little later this year or early next.



https://www.3dmark.com/spy/15080117













https://www.3dmark.com/pr/477235


----------



## Stash

PerfCap VRel/VOp = PSU issue?


----------



## _Killswitch_

Stash said:


> PerfCap VRel/VOp = PSU issue?


Who?


----------



## dev1ance

VRel/VOp usually means it's not power-limited; you could likely raise your clocks a bit.


----------



## _Killswitch_

dev1ance said:


> VRel/VOp usually mean it's not power limited, likely could up your clocks a bit.


I could, and I did, but that was the happy medium I picked: out of 30-something runs of both benchmarks, those were my two best scores with those settings. I was just playing around today; I don't run my 3080 OC'd 24/7. At stock it runs all my games just fine. It truly is a monster of a card.


----------



## cstkl1

Stash said:


> PerfCap VRel/VOp = PSU issue?


Generally, yes, if you get both at the same time.

For example, this case was a PSU issue: an RM750. He changed the PSU and now hits 100% GPU utilization in all games.


----------



## bmgjet

VRel = the card is at the end of its voltage/boost table, so it lists reliability voltage as the performance cap.
VOp = the card is hitting the maximum allowed voltage, so it lists operating voltage as the performance cap.
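For reference, the PerfCap flags being discussed can be collected into a small lookup table. These one-liners are informal summaries drawn from this thread, not NVIDIA's official wording:

```python
# Informal lookup of the GPU-Z "PerfCap Reason" flags discussed above.
PERFCAP_REASONS = {
    "Pwr":  "power limit reached (board power at its cap)",
    "Thrm": "thermal limit reached",
    "VRel": "end of the voltage/frequency boost table (reliability voltage)",
    "VOp":  "maximum allowed operating voltage reached",
}

def explain(flags):
    """Return human-readable reasons for a set of active PerfCap flags."""
    return [PERFCAP_REASONS.get(f, f"unknown flag: {f}") for f in flags]

for line in explain(["VRel", "VOp"]):
    print(line)
```

Seeing VRel or Pwr alone is the normal, healthy case; the thread's point is that VRel and VOp together is the unusual combination worth investigating.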


----------



## cstkl1

bmgjet said:


> VRel = Cards at the end of its voltage boost table so it lists voltage reliablity as performance cap.
> VOp = Cards hitting the max allowed voltage so lists operating voltage as the performance cap.


You can't read it quite like that.

You have to look at power draw, GPU utilization, and voltage together. If GPU temperature/TGP is low, you will hit VRel, for example.

The common case is VRel alone, or the power limit, not VRel and VOp together.

Anyway, you guys can test this out yourselves. I've already sorted three people with this issue at the shop; all three changed PSUs and now show either VRel or the power limit, which is the norm.
Not VRel and VOp together.


----------



## Stash

Yes. VRel on its own is something I'm familiar with, but I had not seen VRel/VOp together before, hence why I am leaning toward the PSU.

I have an RM850x, which should be sufficient. I'll pick up a spare Type 4-to-PCIe cable and see if running the 3x 8-pins through three separate PSU cables improves things, although I find it unlikely as it all goes back to the same 12 V rail, but... we'll see.


----------



## dr.Rafi

A very interesting finding: the Gigabyte 3080 Master and Xtreme are rated at 370 W and 450 W respectively, yet they have the exact same PCB with the same power-sensor and controller chips. The only differences are a third power rail with its inductor, two extra shunts, and some tiny MLCCs and resistors. I'm wondering where our engineers who like modding are, so we can find out how to mod other 2x 8-pin cards to 3x 8-pin and save some cash, especially the Aorus Master, Asus TUF, and even other PCBs.

Xtreme

Master

Picture source: the Review Captain channel; I tried to capture the best frames and crop them.
The small black plug at the end is, I think, for powering the screen and RGB display in retail shops.


----------



## DStealth

Barely exceeding 19600 GPU in Time Spy, probably CPU-limited, with the FTW3 Ultra on the stock cooler.

https://www.3dmark.com/compare/spy/15084412/spy/15045844

In Fire Strike it holds 2160.

12546 in Port Royal: https://www.3dmark.com/pr/477857


----------



## dr.Rafi

DStealth said:


> barely exceeding 19600 GPU in TS probably CPU limiting with FTW3 Ulra on stock cooler
> 
> 
> https://www.3dmark.com/compare/spy/15084412/spy/15045844
> 
> 
> In Firestike it holds 2160
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 12546 PortRoyal https://www.3dmark.com/pr/477857


Time Spy loves memory speed; it's hard to get much higher scores, even in graphics, with 3800 memory on Zen 3.


----------



## DStealth

Hah, the memory isn't slow: at 3840 1T it does 63 GB/s copy; my best effort at 4800 C14 on Intel was a shy over 70.
19653 - https://www.3dmark.com/spy/15084817 with the Strix OC BIOS flashed.
The GPU is very good; the card's memory is the issue, it cannot get over 20 Gbps... even +300/400 gives better scores than +500. My Palit 3080 can bench at +1200 with rising scores.


----------



## PhoenixMDA

@dr.Rafi
On the Asus TUF it's not possible; there is no place for a third connector. The 3080 and 3090 TUF share that PCB design; the 3-connector PCBs are only on the Strix.


----------



## asdkj1740

dr.Rafi said:


> Very interesting finding gigabyte 3080 master and extreme rated at 370 and 450 watt respectively , exact same pcb same power sensor and controller chips the only difference is third power rail with inductor and 2 extra shunts, and some small tiny mlccs and tiny resistors, iam thinking where is our engeneers who like moding to find how we can mod other cards with 2 x 8pins to 3x 8 pins and safe some cash , especially with aorus master,asus tuf , and even other pcbs .
> View attachment 2464613
> 
> extreme
> View attachment 2464614
> 
> master
> Picture source: Review Captain
> this channel but i tried to capture best frames and crop it
> the black small plug in the end ,ithink is for powering the screen and rgb display in the retail shop.


Gigabyte has changed their 6-SP-CAP design back to the NVIDIA reference 5 SP-CAP + 1 MLCC array on the Xtreme.
I remember the Aorus Master is still the 6-SP-CAP design.
Can a 6-SP-CAP design handle 450 W? Haha.

Aorus Master


----------



## Colonel_Klinck

dr.Rafi said:


> You dont need to go through all pages simply choose search on of the page for rtx 3080 only and choose brand asus only then you will have only one page to look at.


Oh I did that and still missed it lol


----------



## Colonel_Klinck

Those of you in the UK, where would you buy shunts from? Thinking of giving it a go once my water block arrives.


----------



## cstkl1

ok now, moment of truth.. is the 5900X faster than the 10900K..
who is right? so many ppl posting odd stuff

10900k - 4400C18 Dual Rank 1.4v PCIE 3.0
5900x - 3800C14 Dual rank 1.4v PCIE 4.0

First round 1080p RTX 3080 fixed at [email protected] | VRAM 21gbps

winner Ryzen.. 204 fps vs 197...

second round @1440p

within margin of error.. TIE

now bump to 2100 MHz FIXED @1080p

Ryzen winner.

oh yeah, 3200C14 on Ryzen is terrible btw. doesn't affect Intel that much

both optimized rigs... which anybody can achieve...
heck, ppl can even argue it's PCIe 4.0.. but a win is a win..

now got to wait for the DF review on frametime.


----------



## PhoenixMDA

Colonel_Klinck said:


> Those of you in the UK, where would you buy shunts from? Thinking of giving it a go once my water block arrives.


The question is what you want: 20% more (stacking 25 mOhm gives 4.166 mOhm) or 25% more (stacking 20 mOhm gives 4 mOhm).
I bought both from here; you need 2x 4-packs because you need 6 pieces.
4X WSL2512R0200FEA Widerstand: power metal Messung SMD 2512 20mΩ 1W ±1% 75ppm/°C | eBay
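Those two options follow from simple parallel-resistor arithmetic: stacking a resistor on top of the stock shunt lowers the resistance the card senses current across, so it under-reads power by the same ratio. A minimal sketch, assuming a 5 mOhm stock shunt (the value that makes the quoted 4.166/4 mOhm figures work out):

```python
def parallel(r1: float, r2: float) -> float:
    """Combined resistance of two resistors stacked in parallel."""
    return (r1 * r2) / (r1 + r2)

STOCK_MOHM = 5.0  # assumed stock shunt value (mOhm), per the discussion above

for stacked in (25.0, 20.0):
    eff = parallel(STOCK_MOHM, stacked)
    gain = STOCK_MOHM / eff - 1.0  # fraction by which the card under-reads power
    print(f"{stacked:g} mOhm stacked -> {eff:.3f} mOhm effective, ~{gain:.0%} extra headroom")
```

Stacking 25 mOhm yields ~4.167 mOhm (about +20%), and 20 mOhm yields 4 mOhm (+25%), matching the figures above.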


----------



## PhoenixMDA

@cstkl1
Nice performance, the 5900X is as fast as my 9900K. Is the RAM OC maxed out with subtimings?
Your 10900K is definitely not maxed.

My 9900k 5,2/4,8/4x8GB 4400CL17-17 [email protected] 0,887V/1950Mhz [email protected],887V +500Mem


----------



## cstkl1

PhoenixMDA said:


> @cstkl1
> Nice Performance, the 5900X same fast as my 9900k.Is the RamOC with sub´s max. out?
> Your 10900k is definitiv not max.
> 
> My 9900k 5,2/4,8/4x8GB 4400CL17-17 [email protected] 0,887V/1950Mhz [email protected],887V +500Mem
> View attachment 2464633


yup.. it's all tweaked.. in this case not MAXed out but LOWed out lol..
at 1440p I think both are the same regardless; what's astonishing is the impact of RAM on the 5900X.
btw, did you notice the massive difference between the current driver and the launch-day driver.. the launch-day driver does 157/158 fps..

10900K stock 4.9 vs 5.2 GHz: not much impact on fps for this game.. especially in that third part where GPU utilization was bad...

so 3200 MHz on the 5900X with the 3080 @2100 drops to 191 fps @1080p.. that's a 10% drop... from 211...

for the 1080p run I could have made a mistake on the 10900K at 2100, because I didn't check via the overlay like I did for 1950.. we put it at 1v.. so zero throttling confirmed, since both are on water via voltage curve..... reconfirmed it with a few runs...

it is a faster system bro.. we could attribute it to PCIe 4.0 etc, but regardless, for people who game at 1080p.. it's an option worth pursuing...
the 5900X is fast out of the box.. CPU OC won't give much gaming performance, but RAM OC on it easily gives 10%... at 1080p...


----------



## cstkl1

also something damn weird..
as you guys know, I cursed my Strix 3080
on air it sucked compared to my TUF 3080
went on water, also wasn't impressed

and now suddenly it's doing 2130-2145 fixed, boosting up to 2175...
all stable..

wth right.. did this GPU need some tender loving care or what..
first I thought it was the latest driver..
reverted to the day-1 driver.. it's doing the same fps...


----------



## Falkentyne

cstkl1 said:


> you cannot see it as that.
> 
> you got to see
> power draw, gpu utilization and voltage.
> if gpu temp/tgp is low u will hit vrel for example..
> 
> common is having vrel or powerlimit not vrel and vop together.
> 
> anyway u guys can test this yourself out. already sorted 3 ppl with this issue at the shop. all 3 changed psu and now is either vrel or powerlimit.. which is norm.
> not vrel and vop together.


This is wrong.
You get vOP if you lock (L in MSI Afterburner V/F curve) the highest voltage point, like 1.081v-1.1v. People lock the voltage to prevent the card from downclocking in low usage games, to get higher FPS, but in most cases this is not necessary. May be useful for old benchmarks.

On some AMD cards, the card was failing to even enter 3D mode on some much older games because the GPU load is too low, which caused low FPS, so people had to write Clockblocker which would force the card into 3D mode. This is much easier to fix on Nvidia cards.



Stash said:


> Yes. VRel on its own is something I'm familiar with, but I had not seen VRel/VOp together before hence why I am leaning toward PSU.
> 
> I have an RM850x which should be sufficient, I'll pick up a spare Type 4->PCIe an see if running the 3x8 pins through 3 separate PSU cables will improve things -- although I find it unlikely as it all goes back to the same 12v rail but... we'll see.


If your card is shunt modded, you will get this at low GPU usage and low temps if you lock 1.1v. It might happen on stock cards too, but there you are more likely to only see it at idle. Once the card starts reducing clocks (-15 MHz per +6 °C), only VRel will appear until power-limit (PWR) throttling.


----------



## PhoenixMDA

@cstkl1 
Mine is maxed out on memory OC for this platform. I think the new Zen 3 performs very well, and some samples can drive over 4000 1:1,
so it will go higher in performance and will be faster than the Intel in Tomb Raider.


----------



## sorokyl

I have a 3080 FE. I have the EK waterblock on the way which I suspect will be pretty decent. I have 2x360 rads to share with 5900x.

I am not trying to set any records, just trying to have fun and unlock any headroom while keeping the card running well for ~4 years. Should I be happy with the 370 W limit, or should I consider a shunt mod? Would it get me more than, say, another 5% performance? Either way I plan to tune the frequency/voltage curve to squeeze as much out of each watt as I can.


----------



## dr.Rafi

PhoenixMDA said:


> @dr.Rafi
> On Asus Tuf it´s not posible, there are no place for a third connector.The 3080/3090 has the PCB design, the 3 conector PCB are only Strix.


I meant modding and adding a new connector, like the zombie KINGPIN mod.


----------



## dr.Rafi

asdkj1740 said:


> gigabyte has changed their 6 spcaps design back to 5+1mlcc array nvidia reference design on xtreme.
> i remeber aorus master is still 6 spcaps design.
> can 6sp caps design handle 450w? haha.
> 
> 
> 
> 
> 
> 
> View attachment 2464617
> 
> 
> 
> 
> aorus master
> View attachment 2464618


SP-caps or MLCCs have nothing to do with power; they keep the GPU stable at high frequency if the silicon quality is below average. I had a Ventus stable, working at 580 W for weeks, by shunt modding.


----------



## dr.Rafi

cstkl1 said:


> also something damn weird..
> as u guys know i cursed my strix 3080
> on air it sucked compared to my tuf 3080
> went on water also wasnt impressed
> 
> and now suddenly its doing 2130-2145 fixed with boosting up to 2175...
> all stable..
> 
> wth right.. did this gpu need some tender loving care or what..
> first i thought it was the latest driver..
> reverted day 1 driver.. its doing the same fps...


Sometimes fingerprints on the PCIe connector cause issues; also try swapping the 8-pin power connectors.


----------



## PhoenixMDA

dr.Rafi said:


> I meant moding and adding new connector like zombie KINGPIN modding .


I know, but this is what the TUF PCB looks like...


----------



## asdkj1740

dr.Rafi said:


> sp cap or mlcc has nothing to do with power , they keep the gpu sable @high frequency if the silicon quality is less than average, i had ventus stable woring 580 watt for weeks by shunt modding.


I know, I just wonder why Gigabyte eventually changed their design on the Xtreme only, rather than on the Master too.

The Master is very poorly priced.


----------



## dr.Rafi

PhoenixMDA said:


> I know but so looks the Tuf PCB...
> View attachment 2464665


Really, the Asus TUF should have 4x 8-pin, not even 3; it's a shame what Asus did, pairing all those rich power stages with only 2 connectors.
I also noticed that even with shunt modding, the 2-connector cards get limited by VRel kicking in (per GPU-Z): the 12 V rails drop to 11.80 V when loaded with high current, which limits some of the card's overclockability, especially when the GPU is running at a low voltage like 0.74 V where the current draw is high, causing the 12 V drop across the rails.


----------



## Frohagen

Managed to buy an Asus RTX 3080 TUF OC directly from Asus' online store around October 27. Looks like I got the only bad egg from them that I can find online. Everyone reported amazing temps from this GPU but mine sometimes maxes out at 86 degrees playing Escape from Tarkov (or Witcher 3, Watchdogs Legion, Path of Exile, etc).

Here are some screenshots of my GPU-Z, Asus Tweak II, MSI Dragon (system/cpu temps), and precision temps for good measure.


http://imgur.com/a/qv6hxHl


I have a Fractal Meshify C with two nice intake fans and one rear and one top exhaust. I tried removing the side panel and the front mesh filter but it made literally zero difference in the temp. The GPU idles at 38 to 43 degrees depending on what mode I place it in in Tweak II. I have never had this issue with any GPU before.

Seems karma decided it should offset my luck getting a 3080 by sending me one that performs poorly.


----------



## dr.Rafi

asdkj1740 said:


> i know, i just wonder why gigabyte eventually changed their design on xtreme only rather than on master too.
> 
> master is extremely poor priced.


Marketing; they try to increase the gap between the two to find customers for the Xtreme, especially after the MLCC drama. If nobody had talked about MLCCs, they would never have changed it. The only difference of the Xtreme compared to the Master is the extra power connector and a toy inside the box.
It's much cheaper for companies to design one PCB and one cooler, so they did the best they could, but they overpriced the Xtreme, and they sell the Master at a high profit too, because both are over MSRP.


----------



## Falkentyne

Frohagen said:


> Managed to buy an Asus RTX 3080 TUF OC directly from Asus' online store around October 27. Looks like I got the only bad egg from them that I can find online. Everyone reported amazing temps from this GPU but mine sometimes maxes out at 86 degrees playing Escape from Tarkov (or Witcher 3, Watchdogs Legion, Path of Exile, etc).
> 
> Here are some screenshots of my GPU-Z, Asus Tweak II, MSI Dragon (system/cpu temps), and precision temps for good measure.
> 
> 
> http://imgur.com/a/qv6hxHl
> 
> 
> I have a Fractal Meshify C with two nice intake fans and one rear and one top exhaust. I tried removing the side panel and the front mesh filter but it made literally zero difference in the temp. The GPU idles at 38 to 43 degrees depending on what mode I place it in in Tweak II. I have never had this issue with any GPU before.
> 
> Seems karma decided it should offset my luck getting a 3080 by sending me one that performs poorly.


Just repaste it. Use a thick paste like Kryonaut Extreme or Thermalright TFX. If you have Kryonaut already, you can use that as well. These GPU's suck power.


----------



## dr.Rafi

Frohagen said:


> Managed to buy an Asus RTX 3080 TUF OC directly from Asus' online store around October 27. Looks like I got the only bad egg from them that I can find online. Everyone reported amazing temps from this GPU but mine sometimes maxes out at 86 degrees playing Escape from Tarkov (or Witcher 3, Watchdogs Legion, Path of Exile, etc).
> 
> Here are some screenshots of my GPU-Z, Asus Tweak II, MSI Dragon (system/cpu temps), and precision temps for good measure.
> 
> 
> http://imgur.com/a/qv6hxHl
> 
> 
> I have a Fractal Meshify C with two nice intake fans and one rear and one top exhaust. I tried removing the side panel and the front mesh filter but it made literally zero difference in the temp. The GPU idles at 38 to 43 degrees depending on what mode I place it in in Tweak II. I have never had this issue with any GPU before.
> 
> Seems karma decided it should offset my luck getting a 3080 by sending me one that performs poorly.


Try reapplying the thermal paste and make sure your heatsink is seated properly on the card.


----------



## duppex

I really hope that this is not officially MSI selling the RTX 3080 on Amazon for £1399.99. The sad thing is there were 6 cards available today and now only 5, so people actually are buying them at this ridiculous price 😤😤😤

MSI GeForce RTX 3080 Gaming X Trio: Amazon.co.uk: Computers & Accessories


----------



## omarrana

dr.Rafi said:


> Latest beta MSI Afterburner
> View attachment 2464419
> In settings, tick control voltage and display voltage. It won't change the voltage by moving the slider, but it will feed a higher voltage to the core at the same boost clock, and you don't need to increase the core clock after maxing the voltage; it will automatically boost the clock a bit higher. If you get a crash, downclock by 5 increments and try again.
> With my MSI Ventus the best BIOS for me is the EVGA 450, or the Aorus Master; the Asus Strix gives me less performance, I think because Asus' power-stage chips are very different from other AIB cards.


Hi, I also have the MSI Ventus 3X non-OC version. Did you flash the EVGA BIOS? Is it worth it? How much performance gain did you get? Thank you!


----------



## dr.Rafi

omarrana said:


> hi i also have msi ventus 3x non oc version. Did you flash evga bios? is it worth it? how much performance gain you got. Thankyou!


Only a shunt mod gives you more performance; the EVGA BIOS gives an extra 50 points in Time Spy but won't increase performance unless you do the shunt mod first.


----------



## omarrana

dr.Rafi said:


> shunt mod only give you more perfomance ,evga bios give extra 50 points in time spy but wont increase performance unless you do shunt mode first.


Thank you for the reply. Unfortunately the shunt mod is too advanced for me. I was just wondering if I could flash a 370 W BIOS, e.g. the Gigabyte Eagle OC, and whether that would give me a little more.
But I guess the card is low-end for overclocking, even though I paid 200 euros over MSRP due to the lack of stock.


----------



## ssgwright

Frohagen said:


> Managed to buy an Asus RTX 3080 TUF OC directly from Asus' online store around October 27. Looks like I got the only bad egg from them that I can find online. Everyone reported amazing temps from this GPU but mine sometimes maxes out at 86 degrees playing Escape from Tarkov (or Witcher 3, Watchdogs Legion, Path of Exile, etc).
> 
> Here are some screenshots of my GPU-Z, Asus Tweak II, MSI Dragon (system/cpu temps), and precision temps for good measure.
> 
> 
> http://imgur.com/a/qv6hxHl
> 
> 
> I have a Fractal Meshify C with two nice intake fans and one rear and one top exhaust. I tried removing the side panel and the front mesh filter but it made literally zero difference in the temp. The GPU idles at 38 to 43 degrees depending on what mode I place it in in Tweak II. I have never had this issue with any GPU before.
> 
> Seems karma decided it should offset my luck getting a 3080 by sending me one that performs poorly.


Just making sure you realize you're running your TUF on the QUIET BIOS.. 48.41 is quiet, 48.40 is PERFORMANCE.


----------



## Frohagen

ssgwright said:


> just making sure you realize your running your TUF on the "QUIET BIOS".. 48.41 is quiet, 48.40 is "PERFORMANCE"


Thanks. I actually do realize that. I received the same results from the stock bios as well as version 48.40. I tried various versions of Nvidia drivers too.


----------



## omarrana

Battler624 said:


> Nah, I'm pretty sure that BIOS was full of issues, because GPU-Z shows the BIOS version as unknown *unless this is normal?*. Anyway, I tried the Asus TUF one and it showed up normally in GPU-Z, so I do think it's a problem with the Gigabyte BIOS.
> 
> The BIOS in question is the Gigabyte RTX 3080 VBIOS.
> 
> 
> Also, Dr.Rafi, how are you getting your numbers? I got the TUF BIOS working but it is still not drawing more power (capped at 320 W even though the BIOS and Afterburner show a 117% power limit).


Hi, which TUF BIOS did you flash on your Ventus? Any improvement in results?


----------



## PhoenixMDA

@dr.Rafi
With 20% more PL (25 mOhm) it's fine by me on the TUF OC.


----------



## Colonel_Klinck

Anyone have a link to the correct shunts for the ASUS cards?


----------



## EarlZ

I've been out of the loop on the 3080 crashing issue, has this been resolved via driver or a silent cart revision? Would it be safe to say that the Asus TUF model would be the safest choice to get ?


----------



## dr.Rafi

PhoenixMDA said:


> @dr.Rafi
> With 20% 25mOhm more PL it´s ok by me with the TufOC.
> View attachment 2464692


What are you using to load the card?


----------



## Falkentyne

Colonel_Klinck said:


> Anyone have a link to the correct shunts for the ASUS cards?


Buy like 3 packs of these so you have extras. One thing to keep in mind is that some boards have the silver sides of the original shunts lower than the middle. This will make stacking shunts tricky if you aren't soldering.









2.98US $ 20% OFF|50pcs Alloy Resistance 2512 Smd Resistor Samples Kit ,10 Kindsx5pcs=50pcs R001 R002 R005 R008 R010 R015 R020 R025 R050 R100 - Resistors - AliExpress (www.aliexpress.com)

If the original shunts are flush, then stacking shunts is easy without soldering, first make sure the conformal coating is completely removed from the original shunts--if there is any on there, just scratch the silver part with a small screwdriver until it becomes a bright silver color. Then apply a thin coat of conductive silver paint around the edges then stack the shunts on top, then let it dry. You can apply liquid electrical tape after it dries fully (wait like 30 minutes) to secure it.

Just apply a thin coat of MG Chemicals silver conductive paint to the edges. Use a toothpick for best results.





MG Chemicals 842AR-15ML Silver Print (Conductive Paint), 12 ml: Amazon.com: Industrial & Scientific (www.amazon.com)


If the original shunts have the silver edges lower than the middle, you can either solder the new shunts on top (solder has like no resistance at all), or you can apply silver conductive paint across the entire shunt, bridging the two edges, and don't even bother stacking new shunts on top. That's because if there is a gap where the edges are lower than the middle, you have to use extra paint which creates unwanted resistance (the paint and the new shunt) so it may act like a higher mOhm shunt (you would need to test this). Painting directly works well here; this works about the same as a 15 mOhm stacked shunt.


----------



## dr.Rafi

Colonel_Klinck said:


> Anyone have a link to the correct shunts for the ASUS cards?








ERJ-M1WSF5M0U Panasonic Electronic Components | Resistors | DigiKey (www.digikey.com.au)

Order today, ships today. ERJ-M1WSF5M0U – 5 mOhms ±1% 1W Chip Resistor 2512 (6432 Metric) Current Sense Metal Element from Panasonic Electronic Components. Pricing and Availability on millions of electronic components from Digi-Key Electronics.


If you don't want to wait forever for AliExpress, that's DigiKey Australia, but they ship from the USA.
Better to get 0.025 ohm instead of the 0.005 ohm that I used, though; 5 mOhm is a bit aggressive on unlocking power, and you might run into problems with the power supply, or even kill your card if you're not careful and don't understand how it works.
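To see why the 5 mOhm stack is aggressive: the card under-reads power by the ratio of the stock shunt to the paralleled value, so stacking 5 mOhm on an assumed 5 mOhm stock shunt roughly doubles the effective power limit. A rough sketch of the arithmetic (the stock shunt value is an assumption, not a measured spec):

```python
def power_multiplier(r_stock_mohm: float, r_stacked_mohm: float) -> float:
    """How much the effective power limit scales after stacking a shunt."""
    r_eff = (r_stock_mohm * r_stacked_mohm) / (r_stock_mohm + r_stacked_mohm)
    return r_stock_mohm / r_eff  # the card under-reads power by this factor

print(power_multiplier(5.0, 5.0))   # 5 mOhm stack: ~2x the power limit (aggressive)
print(power_multiplier(5.0, 25.0))  # 25 mOhm stack: ~1.2x (the tamer +20% option)
```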


----------



## dr.Rafi

Colonel_Klinck said:


> Anyone have a link to the correct shunts for the ASUS cards?


----------



## VPII

Okay, over the past month or so I've been playing around with a Palit RTX 3080 GamingPro OC, a Gigabyte Eagle RTX 3080 OC, and now finally got an MSI RTX 3080 Gaming X Trio, which is by far the best. The normal BIOS for this card works well and temps are pretty good. Given that it takes 3x 8-pin PCIe power connectors, I did try the Asus Strix OC BIOS as well as the EVGA FTW3 Ultra, but temps were like 53 to 54 °C with fans at 100%, whereas with the MSI Gaming X Trio BIOS it would be 44 to 45 °C max. Yes, I do understand that increased power means increased heat, but most of the benches I ran failed as well. Interestingly, I've now noticed on three occasions that the max boost over base clock differs per AIB.

If you take the Palit card with base clock 1740mhz it boost at stock 2040 to 2055mhz which means 300 to 315mhz.

With the Eagle OC you have a base of 1755mhz it would boost at stock to 1980mhz which means 225mhz boost.

With the MSI Gaming X Trio you have a base clock of 1815mhz it boost at stock to 2040mhz which means 225mhz boost from base.

When using the Asus Strix OC bios you have a base clock of 1905mhz and the max boost I have seen while running the same benchmark was 2040 which means 135mhz boost from base.

When using the Evga FTW3 Ultra bios you have a base clock of 1800mhz and the max boost I have seen was 2025mhz and as such a boost of 225mhz from base.

Now correct me if I am wrong. But from what I remember with the RTX 2080 Ti I had, the boost was 300mhz+ from base which makes the Palit the only card that actually boost more or less the same as the RTX 2080 ti. Look I may be wrong, but this is what I have seen from my tests, actually having a Palit, Eagle OC and now MSI Gaming X Trio.
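The boost deltas quoted above are just observed boost minus base clock; a quick tabulation (base and boost values in MHz, copied from the post, taking the upper end of the Palit range):

```python
# Boost-over-base deltas for the cards/BIOSes listed above.
cards = {
    "Palit GamingPro OC":   (1740, 2055),
    "Gigabyte Eagle OC":    (1755, 1980),
    "MSI Gaming X Trio":    (1815, 2040),
    "Asus Strix OC BIOS":   (1905, 2040),
    "EVGA FTW3 Ultra BIOS": (1800, 2025),
}

for name, (base, boost) in cards.items():
    print(f"{name}: +{boost - base} MHz over base")
```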


----------



## dr.Rafi

VPII said:


> Okay, over the past month or so I've been playing around with a Palit RTX 3080 Gamingpro OC, Gigabyte Eagle RTX 3080 OC and now finally got a MSI RTX 3080 Gaming X Trio which is by far the best. The normal bios for this card works well and temps are pretty good. Taken that it does take 3 x 8 pin PCIe power connectors I did try the Asus Strix OC bios as well as the EVGA FTW3 ULTRA but temps were like 53 to 54c with fans at 100% where as with the MSI Gaming X Trio it would be 44 to 45c max. Yes I do understand that the increased power would mean increased heat but most of the benches I ran failed as well. Interestingly I've noticed now on three occasions that the max boost clocks per AIB is different.
> 
> If you take the Palit card with base clock 1740mhz it boost at stock 2040 to 2055mhz which means 300 to 315mhz.
> 
> With the Eagle OC you have a base of 1755mhz it would boost at stock to 1980mhz which means 225mhz boost.
> 
> With the MSI Gaming X Trio you have a base clock of 1815mhz it boost at stock to 2040mhz which means 225mhz boost from base.
> 
> When using the Asus Strix OC bios you have a base clock of 1905mhz and the max boost I have seen while running the same benchmark was 2040 which means 135mhz boost from base.
> 
> When using the Evga FTW3 Ultra bios you have a base clock of 1800mhz and the max boost I have seen was 2025mhz and as such a boost of 225mhz from base.
> 
> Now correct me if I am wrong. But from what I remember with the RTX 2080 Ti I had, the boost was 300mhz+ from base which makes the Palit the only card that actually boost more or less the same as the RTX 2080 ti. Look I may be wrong, but this is what I have seen from my tests, actually having a Palit, Eagle OC and now MSI Gaming X Trio.


Don't worry about the base clock and how much you are boosting in Afterburner or on the on-screen display; the only important thing to compare is the max score or fps each BIOS or card gives you. Every BIOS has different numbers. I use FurMark in non-fullscreen mode with fps on the on-screen display to quickly find the max fps I'm getting. Some BIOSes even use more power but give less fps, and some the opposite.


----------



## Frohagen

Falkentyne said:


> Just repaste it. Use a thick paste like Kryonaut Extreme or Thermalright TFX. If you have Kryonaut already, you can use that as well. These GPU's suck power.





dr.Rafi said:


> try to re apply tne thermal paste and make sure your heatsink is setting properly on the card .


I found a video of GamersNexus tearing down the card, so I think I could figure out how to repaste it, but doesn't that void the warranty?


----------



## darkphantom

Just picked up the MSI Gaming X Trio 3080 - is +175 core and +700 memory good? I can't go above 102% on the power =( and I had to use EVGA Precision because Afterburner keeps crashing.


----------



## Falkentyne

Frohagen said:


> I found a video of GamersNexus tearing down the card so I think I could figure my way out to repasting it but doesn't that void the warranty?


In USA it does NOT void the warranty as long as you don't damage the card while repasting.


----------



## Frohagen

Falkentyne said:


> In USA it does NOT void the warranty as long as you don't damage the card while repasting.


That's good news then. I'll have to research tearing down the card. I've got some Grizzly Kryonaut on hand. I'll just need to be careful. Thanks for the info. Hopefully that'll get the temps down (even a 10 degree drop would be okay with me). Hitting mid 80's when people are reporting 20 degrees lower than that is just crummy.


----------



## cstkl1

Falkentyne said:


> This is wrong.
> You get vOP if you lock (L in MSI Afterburner V/F curve) the highest voltage point, like 1.081v-1.1v. People lock the voltage to prevent the card from downclocking in low usage games, to get higher FPS, but in most cases this is not necessary. May be useful for old benchmarks.
> 
> On some AMD cards, the card was failing to even enter 3D mode on some much older games because the GPU load is too low, which caused low FPS, so people had to write Clockblocker which would force the card into 3D mode. This is much easier to fix on Nvidia cards.
> 
> 
> 
> If your card is shunt modded, you will get this at low GPU usage and low temps if you lock 1.1v. Might happen on stock cards too but more likely to only see this at idle instead. Once card starts reducing clocks at -15 mhz / +6C, only VREL will appear until power limit (PWR) throttling..


if only what you think were true. theoretical vs practical: this was tested via three cards with different PSUs, watching GPU utilization with GPU-Z in a few games.
no OCing, just whatever the default BIOS does.

the PSU was a Corsair RM750
it has only 1 direct 8-pin + a 2x 8-pin

could attribute it to the cable also.


----------



## cstkl1

PhoenixMDA said:


> @cstkl1
> My one is maxed out Memory OC for this Plattform, i think the new Zen3 perform very well and some sample can drive over 4000 1:1,
> so it will go higher in Performance and will be faster then the Intel in TombRaider.


my take on Zen 3: out of the box, if you tune the RAM, it beats a tuned-out Intel @1080p.
but at 1440p, no difference.

this could be attributed to PCIe 4.0 as well, but it's an easy recommendation for people who want plug and play. Intel has so much hidden potential, which requires skill in OC, luck, board, and cooling; it's still the enthusiast overclocker's choice.


----------



## VPII

darkphantom said:


> Just picked up the MSI Gaming X Trio 3080 - is +175 core and +700 memory good? I can't go above 102 on the power =( and had to use EVGA Precision because afterburner keeps crashing.


Okay, those numbers are not bad, but remember the increments are 15 MHz, so you will effectively land on +165 or +180 with those settings. +175 might give you +180, but I cannot confirm that; try +180, and then +165 if +180 does not work. With my MSI Gaming X Trio I can bench all the way up to +180 in some benchmarks, but +165 in most, and in some I need to drop to +150. The quickest way to see is to run the benchmark and watch your clock speeds: where they stay higher for longer you might see it crash at +180, or maybe even +165, but pass at +150.

I must say I am pretty happy with this MSI Gaming X Trio card, and I found that the stock BIOS works best for me.
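As a side note on those 15 MHz bins, here is a minimal sketch of how a requested slider offset lands on a bin. The exact rounding direction the driver uses is an assumption; this sketch snaps to the nearest step:

```python
STEP = 15  # MHz granularity of the Boost clock bins

def effective_offset(requested_mhz: int, step: int = STEP) -> int:
    """Snap a requested core-clock offset to the nearest Boost bin."""
    return round(requested_mhz / step) * step

for req in (150, 165, 175, 180):
    print(f"+{req} requested -> +{effective_offset(req)} applied")
```

So a "+175" slider setting would behave like +180 under this nearest-step assumption, which matches what the post describes.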


----------



## dr.Rafi

GitHub - bmgjet/ShutMod-Calculator: Work out what shunt values to use easily. (github.com)

Work out what shunt values to use easily. Contribute to bmgjet/ShutMod-Calculator development by creating an account on GitHub.

Anybody interested can calculate their shunt mod with this.
All credit to *bmgjet*.


----------



## darkphantom

VPII said:


> Okay, those numbers are not bad, but remember the increments is 15mhz, as such you will have 165mhz or 180mhz using those increments. 175mhz might give you +180mhz, but I cannot confirm that, but try and see +180 and then +165 if +180 does not work. With my MSI Gaming X Trio, I can bench all the way up to +180mhz in some benchmarks, but +165mhz in most and some I need to drop to +150mhz. The quickest way to see is if you run the benchmakr and look at your clock speeds, where it stays higher for longer you might see it crashing at +180 even maybe +165mhz but would pass +150mhz.
> 
> I must say that I am pretty happy with this MSI Gaming X Trio card and I found that the stock bios works the best for me.


Yeah, I'm seeing that +175 is not stable; I'll drop it down to +165 or +150 for daily use. What are you getting on memory?


----------



## VPII

darkphantom said:


> Yeah, I'm seeing that 175 is not stable, will drop it down to 165 or 150 for daily use. What are you getting on mem?


I tested +750 first, but have now tested +1000 MHz and it worked as well. What I found with my Palit and Gigabyte cards is that when I went up to +1000, the core clock would be locked almost at max but performance would be really poor, yet with this MSI Gaming X Trio the +1000 memory works really well. I found that I can run benchmarks like Time Spy and Unigine Superposition at +165 to +180 MHz, but for gaming I had to drop to +135 MHz, which is still great with clock speeds past 2100 MHz while gaming.


----------



## dr.Rafi

Did anybody try to flash a 3080 with a 3090 BIOS? Sounds crazy? I want to give it a try.


----------



## PhoenixMDA

cstkl1 said:


> my take on zen 3.. out of the box. if u tuned the ram.. its beats a tuned out intel. @1080p.
> but 1440p no diff.
> 
> this could be attributed to pcie 4.0 as well but its a easy recommend for ppl who want plug and play. intel has so much hidden potential which requires skill on oc/luck/board and cooling. its still enthusiast ocer choice.


Yes, that's true; with Intel you need very good hardware to get the best results, while AMD gives better performance at stock, and it is now easier to get awesome performance with OC.


----------



## Falkentyne

dr.Rafi said:


> Did anybody try to flash 3080 with 3090 bios ?sound crazy ? I want to give it try .


Don't even think about it. The chips are not identical, the memory bus width is not identical...just don't.


----------



## desislaf

Hey guys, got my Palit 3080 GameRock OC recently, but I am only able to increase the core clock by 70 and the memory by 600. Time Spy crashes if I increase the core clock slightly more. I already maxed out the power limit and core voltage. Do you think something is wrong with my card?


----------



## Colonel_Klinck

Falkentyne said:


> Buy like 3 packs of these so you have extras. One thing to keep in mind is that some boards have the silver sides of the original shunts lower than the middle. This will make stacking shunts tricky if you aren't soldering.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2.98US $ 20% OFF|50pcs Alloy Resistance 2512 Smd Resistor Samples Kit ,10 Kindsx5pcs=50pcs R001 R002 R005 R008 R010 R015 R020 R025 R050 R100 - Resistors - AliExpress (www.aliexpress.com)
> 
> 
> 
> 
> 
> 
> If the original shunts are flush, then stacking shunts is easy without soldering, first make sure the conformal coating is completely removed from the original shunts--if there is any on there, just scratch the silver part with a small screwdriver until it becomes a bright silver color. Then apply a thin coat of conductive silver paint around the edges then stack the shunts on top, then let it dry. You can apply liquid electrical tape after it dries fully (wait like 30 minutes) to secure it.
> 
> just apply a thin coat of MG chemicals silver conductive paint to the edges. Use a toothpick for best results.
> 
> 
> 
> 
> 
> MG Chemicals 842AR-15ML Silver Print (Conductive Paint), 12 ml: Amazon.com: Industrial & Scientific (www.amazon.com)
> 
> 
> 
> 
> 
> 
> If the original shunts have the silver edges lower than the middle, you can either solder the new shunts on top (solder has like no resistance at all), or you can apply silver conductive paint across the entire shunt, bridging the two edges, and don't even bother stacking new shunts on top. That's because if there is a gap where the edges are lower than the middle, you have to use extra paint which creates unwanted resistance (the paint and the new shunt) so it may act like a higher mOhm shunt (you would need to test this). Painting directly works well here; this works about the same as a 15 mOhm stacked shunt.



Thanks mate, ordered


----------



## VPII

desislaf said:


> Hey guys, I got my Palit 3080 GameRock OC recently, but I'm only able to increase the core clock by 70 MHz and the memory by 600 MHz. Time Spy crashes if I increase the core clock any further. I've already maxed out the power limit and core voltage. Do you think something is wrong with my card?
> 
> View attachment 2464709


Silicon lottery, I'd say. I had an Eagle OC that couldn't manage clock speeds above 2070 MHz. Check what your max boost clock is at stock by running the Tomb Raider benchmark or Watch Dogs Legion at 1080p, since it won't draw that much power there. Then subtract your rated clock from that speed to get the boost offset. With my Palit GamingPro OC it was about 300 to 315 MHz, but with all of the other models I've had it was around 225 MHz. Check that and let me know.
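VPII's check above is just a subtraction, but for clarity here it is as a tiny sketch. The helper name is made up for illustration; the example numbers are desislaf's from later in the thread (2025 MHz observed peak, 1860 MHz rated boost on the GameRock OC).

```python
# How far above its rated boost clock the card actually boosts at stock.
# Run a light-load benchmark (e.g. the Watch Dogs Legion bench at 1080p)
# and note the observed peak clock first.

def boost_offset(observed_peak_mhz, rated_boost_mhz):
    """MHz the card dynamically boosts beyond its rated boost clock."""
    return observed_peak_mhz - rated_boost_mhz

print(boost_offset(2025, 1860))  # -> 165
```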


----------



## aayman_farzand

Got myself a Colorful iGame 3080 Advanced. Not one of the usual brands, but it was the only thing I could secure. The card works great and looks fantastic: minimalistic, plus some tasteful lighting to remind me it's for gaming.

I think this is the cheapest 3x 8-pin card, but I'm sure its power phases aren't on par with the Strix/FTW3. How are you guys testing your boosts? 3DMark only, or something more?


----------



## Xipe

Which BIOS can I flash on the Gigabyte 3080 Eagle?


----------



## acrvr

Anyone tried mounting an AIO?


----------



## VPII

acrvr said:


> Anyone tried mounting an AIO?


I did, with the NZXT Kraken G12 on my Palit GamingPro OC. The only issue was that the mounting holes at the bottom were too close to the PCIe slot, so the NZXT bracket kept the card from slotting into the PCIe. I modified the bracket and got it all working with a Corsair H110 cooler. It worked well, but I was somewhat disappointed, as performance was not as great as when I used it on my RTX 2080 Ti. Temps touched 50°C, but that was with 30°C ambient that day.


----------



## Reinhardovich773

desislaf said:


> Hey guys, I got my Palit 3080 GameRock OC recently, but I'm only able to increase the core clock by 70 MHz and the memory by 600 MHz. Time Spy crashes if I increase the core clock any further. I've already maxed out the power limit and core voltage. Do you think something is wrong with my card?
> 
> View attachment 2464709


I think you sadly lost the silicon lottery my friend...


----------



## VPII

Reinhardovich773 said:


> I think you sadly lost the silicon lottery my friend...


Maybe not exactly... What I've seen with my Palit GamingPro OC is that boost speed was 300 to 315 MHz higher than the rated clock. Given that the GameRock OC has a rated boost clock of 1860 MHz, if the same boost behaviour applies you'd already sit at around 2160 MHz stock boost; the +70 MHz he stated would then take it well over 2200 MHz, so the +70 may not be that bad. Which is why I asked him to see where his boost clock maxes out when running stock.


----------



## desislaf

VPII said:


> Silicon lottery, I'd say. I had an Eagle OC that couldn't manage clock speeds above 2070 MHz. Check what your max boost clock is at stock by running the Tomb Raider benchmark or Watch Dogs Legion at 1080p, since it won't draw that much power there. Then subtract your rated clock from that speed to get the boost offset. With my Palit GamingPro OC it was about 300 to 315 MHz, but with all of the other models I've had it was around 225 MHz. Check that and let me know.


Tested it in WD Legion and the core clock hits 2025 MHz max. The GameRock OC's rated boost clock is 1860, so that makes 165 MHz, which is sad... Should I return it and look for another card, then? I bought it a couple of days ago, so it's still within the 14-day return period.

What I actually meant was +70 on top of the stock boost clock.

@Reinhardovich773 bad news indeed.


----------



## Yoske13

Hi guys. Is this a good result for my EVGA RTX 3080 FTW3 Ultra?


https://www.3dmark.com/3dm/52763640


Graphics score: 19,616 in Time Spy


----------



## VPII

desislaf said:


> Tested it in WD Legion and the core clock hits 2025 MHz max. The GameRock OC's rated boost clock is 1860, so that makes 165 MHz, which is sad... Should I return it and look for another card, then? I bought it a couple of days ago, so it's still within the 14-day return period.
> 
> What I actually meant was +70 on top of stock boost clock.
> 
> @Reinhardovich773 bad news indeed.


Hi, well, not really; not all AIB partner cards boost the same. Lower-end cards boost further above their rated clock than the higher-end ones do.


----------



## acrvr

VPII said:


> I did with the Nzxt Kraken G12 on my Palit Gamingpro OC. Only issue was the mounting holes at the bottom was to close to the PCIe and as such the Nzxt fitting resulted in the card not slotting into the PCIe.... I did modify the fitting and got all working with a Corcair H110 cooler. It worked well, but I was somewhat disappointed as performance was not as great as when I used it with my RTX 2080 Ti. Temps touched 50c but that was with 30c ambient on that day.


Do you still plan to give the AIO another try? 50°C is pretty bad for a 280mm.


----------



## Johneey

acrvr said:


> Do you still plan to give the AIO another try? 50 is pretty bad for a 280mm.


What? A 20°C delta (ambient to GPU) on a small 280mm is good, dude.


----------



## asdkj1740

acrvr said:


> Do you still plan to give the AIO another try? 50 is pretty bad for a 280mm.


That's actually great.
What did you expect?


----------



## VPII

acrvr said:


> Do you still plan to give the AIO another try? 50 is pretty bad for a 280mm.


I may try it at a later stage with my Gaming X Trio, but the memory bothered me, as it gets no cooling with that setup.


----------



## Colonel_Klinck


So are people just adding shunts to the two shunts directly below the two 8-pins? Or are you shunting all five plus the PCIe one? I have the 3080 TUF, so it's the same board Roman is working on.


----------



## Falkentyne

Colonel_Klinck said:


> So are people just adding shunts to the 2 shunts directly below the 2 8pins? or are you shunting all 5 and the PCI-E one? I have the 3080 TUF so the same as the board Roman is working on.


You need to shunt all of the shunts on the board. If you don't, the PCIe slot limit will cap the power draw from the 8-pins, and there's chip power to worry about as well. The Strix uses a lower ratio of PCIe to 8-pin power, so there is some benefit to just shunting the three 8-pins, but for best results shunt them all.

If you don't shunt the "GPU power shunt", then even if the slot and the three 8-pins are all shunted, the GPU chip reading will end up limiting the power.


----------



## Colonel_Klinck

Falkentyne said:


> You need to shunt all of the shunts on the board. If you don't, the PCIE slot limit will limit the power draw from the 8 pins, and there's even chip power to worry about. The Strix uses a lower ratio of PCIE to 8 pins, so there is some benefit to just shunting the three 8 pins, but for best results shunt them all.
> 
> If you don't shut the "GPU power shunt", then even if the slot and three 8 pins are all shunted, then the GPU chip will end up limiting the power.


Yeah, I'd seen that in another video, but der8auer said to only bother with the two by the 2x 8-pins on the TUF.


----------



## dr.Rafi

Colonel_Klinck said:


> So are people just adding shunts to the 2 shunts directly below the 2 8pins? or are you shunting all 5 and the PCI-E one? I have the 3080 TUF so the same as the board Roman is working on.


To unlock everything you have to shunt all five (four beside the connectors and one further away). The sixth one, for PCIe, is close to that connector; you can shunt it to increase the power further, and some in the 3090 thread claim it still limits power if you don't shunt it, because it maxes out. Try to find the calculator I posted earlier to work out exactly how much max power you're after. For 3x 8-pin cards I think there's no need to do any shunts, because maxing out at 450 W is enough to get all the performance you need for 24/7 use; past that limit you start running into thermal issues.
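dr.Rafi's calculator isn't reproduced in the thread, but the arithmetic behind this kind of tool is short. The card computes current from the voltage drop across a shunt of assumed resistance, so stacking a resistor in parallel makes it under-read power. A sketch, assuming 5 mOhm factory shunts (an assumption; verify on your own board) and a parallel-stacked resistor:

```python
# Hypothetical helper (not dr.Rafi's actual calculator): the card reads
# current as V/R assuming the factory shunt value R_orig. Stacking R_stack
# in parallel lowers the real resistance to R_eff = R_orig*R_stack/(R_orig+R_stack),
# so the card under-reads power by m = R_orig/R_eff and the BIOS limit
# maps to roughly m * limit real watts.

def stack_value_for_target(bios_limit_w, target_w, r_orig_mohm=5.0):
    """Shunt value (mOhm) to stack so `bios_limit_w` becomes ~`target_w`."""
    m = target_w / bios_limit_w        # required under-read factor
    return r_orig_mohm / (m - 1.0)     # from 1/R_eff = 1/R_orig + 1/R_stack

# Example: a 320 W BIOS that should actually allow about 450 W:
print(round(stack_value_for_target(320, 450), 1))  # -> 12.3
```

So for a 450 W target on a 320 W BIOS you would pick the nearest common value, e.g. the 10 or 15 mOhm resistors from the kit linked earlier.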


----------



## Falkentyne

Colonel_Klinck said:


> Yeah, I'd seen that in another video, but der8auer said to only bother with the two by the 2x 8-pins on the TUF.


That depends on how low the PCIE slot ratio is in comparison to the two 8 pins. 
If the Strix is at Power limit with 75W PCIE max and 150W * 2 from the 8 pins, you are not going to get anywhere not shunting the PCIE slot.
If it's 55W from PCIE and the two 8 pins are drawing 170 * 2, then you can get away with it. You still have chip (and possibly SRC and MVDDC) that can mess you up however.


----------



## Johneey

You need to shunt them all. Otherwise you run into the PCIe power limit and get higher heat for nothing. I tried it myself.


----------



## Nizzen

Johneey said:


> You need to shunt them all. Otherwise you run into the PCIe power limit and get higher heat for nothing. I tried it myself.


My 3090 strix has all shunts modded. It works great.


----------



## Colonel_Klinck

Ok thanks guys. I've ordered enough to do all of them


----------



## woppy101

I managed to pick up a 3080 AORUS Master, but I'm missing the neatness of everything being watercooled. Are there any blocks out there that will fit the Master?


----------



## Battler624

omarrana said:


> hi which tuf bios did you flash for your ventus? any improvement in result


No improvements at all; the card is limited by hardware (it can't push more than 320 W without modifying the hardware).


----------



## Colonel_Klinck

Falkentyne said:


> That depends on how low the PCIE slot ratio is in comparison to the two 8 pins.
> If the Strix is at Power limit with 75W PCIE max and 150W * 2 from the 8 pins, you are not going to get anywhere not shunting the PCIE slot.
> If it's 55W from PCIE and the two 8 pins are drawing 170 * 2, then you can get away with it. You still have chip (and possibly SRC and MVDDC) that can mess you up however.


At the moment it's pulling a max of about 150 W on each of the 8-pins and 59.6 W from the PCIe slot. It's a 2x 8-pin TUF, so a 375 W BIOS, although the max power shown in GPU-Z and HWiNFO is 359 W.


----------



## Falkentyne

Colonel_Klinck said:


> At the moment it's pulling a max of about 150 W on each of the 8-pins and 59.6 W from the PCIe slot. It's a 2x 8-pin TUF, so a 375 W BIOS, although the max power shown in GPU-Z and HWiNFO is 359 W.


You can try just modding the two 8 pins, but it will throw off the power delivery reporting since all the shunts will then be using different total values. Just play it safe and mod all of them. Do it right the first time and don't take the lazy way out.

all you need are wooden toothpicks, MG Chemicals silver conductive paint, replacement thermal pads (sizes depend on your card) and thermal paste. And a steady hand.

Your shunts will look like this after modding.










Remember you don't need to solder or even to stack shunts. You can scrape the "silver edges" of the shunts with a tiny flat screwdriver to remove the conformal coating (I don't know which boards have coating, FE cards do for sure), then paint over the entire shunt to "bridge" the edges with MG Chemicals SILVER conductive paint. This paint is 15 mOhms exactly. So you are safe in using a 1.33x HWinfo multiplier afterwards. Be safe and always cover up the jar after every time you dip the toothpick in and use as little as possible while painting over the entire shunt. It's a lot safer than soldering if you aren't experienced. Let dry 15 minutes before flipping the card over to do the opposite side shunts.
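For reference, the 1.33x HWiNFO multiplier mentioned above falls straight out of the parallel-resistance math, assuming the factory shunts are 5 mOhm (my assumption for the sketch; check your own card's values):

```python
# Where the 1.33x HWiNFO multiplier comes from: the painted bridge acts
# as a resistor in parallel with the factory shunt, so the card sees a
# smaller voltage drop and under-reports current (and hence power).

R_ORIG = 5.0    # mOhm, factory shunt (assumed value)
R_PAINT = 15.0  # mOhm, bridge formed by the silver paint (per the post)

r_eff = (R_ORIG * R_PAINT) / (R_ORIG + R_PAINT)  # parallel combination
multiplier = R_ORIG / r_eff  # factor to correct the reported power by

print(r_eff)                  # -> 3.75 (mOhm effective)
print(round(multiplier, 2))   # -> 1.33
```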


----------



## shallow_

Disclaimer: This is not meant to offend any of the enthusiasts in here.

So... I've been doing some really crazy stuff with my RTX 3080 Strix since I got it last Friday.

I installed it in my PC, and I have been gaming on it ever since!! Yeah, crazy, right?










Oh, by the way, I did actually do a slight mod on the card: I took a black permanent marker to the side of the PCB, as the yellow looked horrible.


----------



## ssgwright

Falkentyne said:


> You can try just modding the two 8 pins, but it will throw off the power delivery reporting since all the shunts will then be using different total values. Just play it safe and mod all of them. Do it right the first time and don't take the lazy way out.
> 
> all you need are wooden toothpicks, MG Chemicals silver conductive paint, replacement thermal pads (sizes depend on your card) and thermal paste. And a steady hand.
> 
> Your shunts will look like this after modding.
> 
> View attachment 2464757
> 
> 
> Remember you don't need to solder or even to stack shunts. You can scrape the "silver edges" of the shunts with a tiny flat screwdriver to remove the conformal coating (I don't know which boards have coating, FE cards do for sure), then paint over the entire shunt to "bridge" the edges with MG Chemicals SILVER conductive paint. This paint is 15 mOhms exactly. So you are safe in using a 1.33x HWinfo multiplier afterwards. Be safe and always cover up the jar after every time you dip the toothpick in and use as little as possible while painting over the entire shunt. It's a lot safer than soldering if you aren't experienced. Let dry 15 minutes before flipping the card over to do the opposite side shunts.


Can you link me to the silver conductive paint? I'm seeing many different options: pen, paste, adhesive, etc. Yours looks like maybe a paste? I used liquid metal and got amazing results, but I removed it because I didn't want it eating the solder and ending up with the resistor falling off (like it did on my 1080, lol).

is this what you used?: https://www.walmart.com/ip/MG-Chemi...924684?wmlspartner=wlpa&selectedSellerId=3354


----------



## dr.Rafi

Colonel_Klinck said:


> Yeah I'd seen that on another video but de8auer said only bother with the 2 by the 2x 8 pins on the TUF


Don't just follow what der8auer said; it was very new even to him in that video, and he isn't up day and night like us, trying everything.
Any power shunt on the card that maxes out will cap the rest. It's also better to do all the shunts: it keeps the balance between the power sources and won't confuse the current-balancing chips that regulate the current. I referred you to der8auer's video to learn the technique. And as I mentioned before, it's better to start with 0.025 ohm shunt resistors; if the card reaches its max overclock limit (to the crash point) with no power-limit flags in GPU-Z and Afterburner, then that's your max and there's no need to go further, as you won't get any extra performance from more power.


----------



## Falkentyne

ssgwright said:


> can you link me to the silver conductive paint? I'm getting many different options, pen, paste, adhesive... etc. Yours looks like maybe a paste? I used liquid metal and got amazing results but I removed it because I don't want it to eat the solder and end up having my resistor fall off (like it did on my 1080 lol)
> 
> is this what you used?: https://www.walmart.com/ip/MG-Chemi...924684?wmlspartner=wlpa&selectedSellerId=3354


Yes, it's a grey, paste-looking material, though it's more like toxic, smelly paint and much more liquid than a paste; it dries quickly, like paint does.
What you want is type "842AR".






MG Chemicals 842AR-15ML Silver Print (Conductive Paint), 12 ml: Amazon.com: Industrial & Scientific




----------



## DStealth

How do you measure the resistance with this method? The datasheet here (842AR-P - Silver Conductive Pen - MG Chemicals) states values based on Ω/cm (volume resistivity).


Falkentyne said:


> This paint is 15 mOhms exactly. So you are safe in using a 1.33x HWinfo multiplier afterwards.


----------



## Colonel_Klinck

dr.Rafi said:


> Don't just follow what der8auer said; it was very new even to him in that video, and he isn't up day and night like us, trying everything.
> Any power shunt on the card that maxes out will cap the rest. It's also better to do all the shunts: it keeps the balance between the power sources and won't confuse the current-balancing chips that regulate the current. I referred you to der8auer's video to learn the technique. And as I mentioned before, it's better to start with 0.025 ohm shunt resistors; if the card reaches its max overclock limit (to the crash point) with no power-limit flags in GPU-Z and Afterburner, then that's your max and there's no need to go further, as you won't get any extra performance from more power.


So that's all six shunts then? Five by the two 8-pins and one by the PCIe?


----------



## Falkentyne

Colonel_Klinck said:


> So that's all six shunts then? Five by the two 8-pins and one by the PCIe?
> 
> View attachment 2464839
> 
> 
> View attachment 2464840


Yep those are the ones.


----------



## PhoenixMDA

Colonel_Klinck said:


> So that's all six shunts then? Five by the two 8-pins and one by the PCIe?
> 
> View attachment 2464839
> 
> 
> View attachment 2464840


Yes, I did all six shunts, at first with 25 mOhm; on air I wouldn't do more than that. Today I changed to 20 mOhm, which is enough for my card to do 2100 MHz+ in heavy-load games with watercooling. But I'm still waiting for my waterblock.


----------



## dr.Rafi

PhoenixMDA said:


> Yes, I did all six shunts, at first with 25 mOhm; on air I wouldn't do more than that. Today I changed to 20 mOhm, which is enough for my card to do 2100 MHz+ in heavy-load games with watercooling. But I'm still waiting for my waterblock.


Ordered a Bykski and waiting too.
View attachment 2464882

At least I have the CPU water block setup back on, but I'm only overclocking the GPU core in games. I also switched to 0.010 ohm, and sometimes in high-FPS, low-core-voltage games I just bring the power slider back down to save power.


----------



## darkphantom

How do you bypass the 102% power limit on MSI cards?


----------



## DirtyScrubz

After messing around a bit more on air, I got the card to peak at 2175 mhz but it crashed at the final part of Port Royal. I think on water I'll be able to stabilize it better w/added voltage.















Edit: Ran Time Spy; it peaked at 2160 then dropped to 2145 MHz because of power/VRel and temp:
https://www.3dmark.com/spy/15154741

No shunt mods or anything, all stock from the factory w/the obvious exception of upping PL/temp/clocks. All I need now is the EK block to release and I'll be happy.


----------



## PhoenixMDA

dr.Rafi said:


> Ordered Bykski and waiting too
> View attachment 2464882
> 
> At least i have the cpu water block setup back on ,but only overclocking the gpu core in games. and switched to 0.010 ohm , and some times with high fps low core voltage games i just bring the power slider back to save power .


Yes, it's terrible; I've been waiting so long for this block.
I think 20 mOhm is enough; I can get good clocks.


----------



## SoldierRBT

Unigine Heaven benchmark: 2235 MHz, 1.10v locked, +1200 memory, on air, max temp 61°C.


----------



## dr.Rafi

DirtyScrubz said:


> After messing around a bit more on air, I got the card to peak at 2175 mhz but it crashed at the final part of Port Royal. I think on water I'll be able to stabilize it better w/added voltage.
> 
> View attachment 2464892
> View attachment 2464893
> 
> 
> Edit: Ran timespy, it peaked at 2160 then dropped to 2145 mhz b/c of power/vrel and temp:
> https://www.3dmark.com/spy/15154741
> 
> No shunt mods or anything, all stock from the factory w/the obvious exception of upping PL/temp/clocks. All I need now is the EK block to release and I'll be happy.


Strix?
You see, as I said, max or average clock isn't what matters for max performance; the graphics score or FPS is.


https://www.3dmark.com/spy/15030476


If you look, my average and max clocks are lower than yours, but look at the graphics score. I know the CPU also affects it, but I've seen Time Spy graphics scores of 20,150 with an 8700K at a CPU score of 10,000, so the CPU affects the graphics score by around plus or minus 200 points, not more.
I think some manufacturers are cheating with boost clocks by offsetting the reported GPU boost clock for marketing purposes.


----------



## VPII

DirtyScrubz said:


> After messing around a bit more on air, I got the card to peak at 2175 mhz but it crashed at the final part of Port Royal. I think on water I'll be able to stabilize it better w/added voltage.
> 
> View attachment 2464892
> View attachment 2464893
> 
> 
> Edit: Ran timespy, it peaked at 2160 then dropped to 2145 mhz b/c of power/vrel and temp:
> https://www.3dmark.com/spy/15154741
> 
> No shunt mods or anything, all stock from the factory w/the obvious exception of upping PL/temp/clocks. All I need now is the EK block to release and I'll be happy.





dr.Rafi said:


> Strix?
> You see, as I said, max or average clock isn't what matters for max performance; the graphics score or FPS is.
> 
> 
> https://www.3dmark.com/spy/15030476
> 
> 
> If you look, my average and max clocks are lower than yours, but look at the graphics score. I know the CPU also affects it, but I've seen Time Spy graphics scores of 20,150 with an 8700K at a CPU score of 10,000, so the CPU affects the graphics score by around plus or minus 200 points, not more.
> I think some manufacturers are cheating with boost clocks by offsetting the reported GPU boost clock for marketing purposes.


I'm going to lay something else out there as well. When your memory clock is too high, performance gets limited while the GPU clocks stay very high (as the card uses less power), with the memory holding performance back; your max and average clock speeds would be higher, but performance way lower.


----------



## dr.Rafi

VPII said:


> I'm going to lay something else out there as well. When your memory clock is too high, performance gets limited while the GPU clocks stay very high (as the card uses less power), with the memory holding performance back; your max and average clock speeds would be higher, but performance way lower.


Memory affects it very little, also around plus or minus 100 to 200 points.


----------



## ssgwright

SoldierRBT said:


> Unigine Heaven benchmark 2235MHz 1.10v locked/ +1200 memory on air max temp 61C.


how did you lock the voltage?


----------



## SoldierRBT

ssgwright said:


> how did you lock the voltage?


Set the core voltage slider to 100%, then lock 1.10v in the curve editor (press L and apply).


----------



## dr.Rafi

SoldierRBT said:


> Unigine Heaven benchmark 2235MHz 1.10v locked/ +1200 memory on air max temp 61C.






2410 MHz on water, nothing locked.


----------



## SoldierRBT

dr.Rafi said:


> 2410 MHz on water, nothing locked.


Nice. Resolution? Looks like 720p. Mine was at 3440x1440. What were your clocks on air? Time Spy score?


----------



## dr.Rafi

SoldierRBT said:


> Nice. Resolution? Looks like 720p. Mine was at 3440x1440. What was your clocks on air? Time Spy score?





https://www.3dmark.com/spy/15030476


----------



## dr.Rafi

SoldierRBT said:


> Nice. Resolution? Looks like 720p. Mine was at 3440x1440. What was your clocks on air? Time Spy score?


The resolution is only for the capture, but you still didn't get what I meant by the second picture. Think about it.


----------



## PhoenixMDA

@dr.Rafi 
But this isn't your card? Volt mod + open power limit, or?

@SoldierRBT 
Nice. I tested it; I get more FPS but run into the power limit ^^


----------



## dr.Rafi

PhoenixMDA said:


> @dr.Rafi
> But this is not your card?VoltMod +Open PL or?
> 
> @SoldierRBT
> Nice, i have tested i have more FPS and go in Powerlimit^^


I'll let you know soon; wait for the next video in 10 minutes.


----------



## VPII

dr.Rafi said:


> Memory affects it very little, also around plus or minus 100 to 200 points.


Well, I have had:

no 1. Palit RTX 3080 GamingPro OC
no 2. Gigabyte RTX 3080 Eagle OC
no 3. MSI RTX 3080 Gaming X Trio (which I use now)

What I found with the Palit, which could easily do +750 MHz on memory (same as the Gigabyte), is that when I increased the memory to say +1000 MHz, during Time Spy the core clock would stay almost maxed but the frames per second would be way lower than normal. It has been said on this forum that GDDR6X has error correction, so maybe that's why it happens. I haven't tested it with the MSI, but I'll try something silly like +1250 MHz and see if I get the same behaviour. I'm just saying it as I see it.


----------



## ssgwright

I just tried this "force voltage" thing by hitting "L" and then Apply in the Afterburner curve editor, but it does nothing... as soon as you push the GPU, the voltages start changing...


----------



## dr.Rafi

PhoenixMDA said:


> @dr.Rafi
> But this is not your card?VoltMod +Open PL or?
> 
> @SoldierRBT
> Nice, i have tested i have more FPS and go in Powerlimit^^






4040 on the core clock and 15300 on memory, locked to 1.093v.


----------



## SoldierRBT

ssgwright said:


> I just tried this "force voltage" by hitting "L" and then apply in afterburner curve but it does nothing... as soon as you push the gpu the voltages start changing...


It's probably hitting the power limit; that's why the voltage drops. You need to first lock a voltage at which it doesn't hit the PL, then push the core as far as you can. Only in light loads can you lock 1.10v. In Port Royal my max is 1.056v (2200 MHz avg), and 1.012v (2160 MHz avg) in Time Spy. If you shunt mod the card you can push higher voltage in heavy loads.


----------



## SoldierRBT

dr.Rafi said:


> 4040 on core clock and 15300 on memeroy locked to 1.093v





https://www.3dmark.com/spy/14951377


----------



## dr.Rafi

SoldierRBT said:


> https://www.3dmark.com/spy/14951377


Yes, I've seen it before. The only score or data I believe is 3DMark's; anything else is easy to mod by offsetting the numbers. GPU-Z is still legitimate too, because you can't change the sensor data.


----------



## PhoenixMDA

@dr.Rafi 
It's easier to work with one fixed point in games; with a curve it's more unstable. If the load isn't too heavy you can hold around 2200 +/-. I can do less than SoldierRBT because of chip quality, or I need lower temps.


----------



## Zeakie

How's Valhalla running for ye? I'm running stable at 2040 MHz locked at 0.95v, +500 on mem, 3900X. 65 FPS avg in the benchmark at 4K maxed, AA on low, on a Trinity OC.


----------



## ZOONAMI

Is just under 19k graphics in Time Spy an acceptable score for an ASUS TUF? I've pretty much pushed it as far as it will go while staying stable.

3DMark says I'm in the top 11% for an 8700K and 3080, so I guess I'll take it.

Time Spy clocks seem to hover a bit under 2000 MHz; in games it's able to stay above 2000.


----------



## dr.Rafi

ZOONAMI said:


> Just under 19k graphics timespy is an acceptable score for an Asus Tuf? I have pretty much pushed it as far as it will go stable.
> 
> 3dmark says I am top 11% for an 8700k and 3080 so I guess I'll take it.
> 
> Timespy clocks seem to hover a bit under 2000mhz, in games it's able to stay above 2000.


Shunt mod and you will get 20000+ graphic time spy


----------



## SEALBoy

I've searched this thread but haven't really seen anything definitive on this...

Is there a BIOS that actually raises the Ventus's power limit above 320 W (not just reporting higher, but actually higher)? Or is the shunt mod the only way?


----------



## ZOONAMI

dr.Rafi said:


> Shunt mod and you will get 20000+ graphic time spy


Do I need to solder for that? If so, I'm fine with a 5% difference. In games it sits comfortably over 2000 MHz.


----------



## ssgwright

Hmm, I can't get the silver conductive paint here in Hawaii; it can't be shipped by air and I can't seem to find it anywhere on the island... I'm thinking about just using solder, but I'm worried it will put the card in limp mode.


----------



## SoldierRBT

The new Raytracing test can run high clocks. 



https://www.3dmark.com/dxr/15527


----------



## Falkentyne

ssgwright said:


> hmm I can't get the silver conductive paint here in hawaii, it can't be shipped by air and I can't seem to find it anywhere on island... thinking about just using solder but I'm worried it will put the card in limp mode


Solder has almost no resistance at all; it just bridges connections. So you can stack a 10 to 15 mOhm shunt on top of the original one and solder it on. Make sure to scrape the conformal coating off the silver edges of the original shunt with a small flat screwdriver first, until they become shiny, then clean them with isopropyl alcohol (be careful and go slowly; a slip can be fatal to the board).

I can't solder things that small myself, but soldering 15-20 mOhm shunts (2512-size, current-sensing shunts) on top of the six original shunts will not put the board into safe mode.


----------



## PhoenixMDA

SoldierRBT said:


> The new Raytracing test can run high clocks.
> 
> 
> 
> https://www.3dmark.com/dxr/15527


In this test you can run much higher clocks than elsewhere; I can start at 2235 MHz and do 51 FPS max.
Your chip is awesome... gz Soldier ^^ I can only hold 2 GHz.


----------



## ssgwright

SoldierRBT said:


> The new Raytracing test can run high clocks.
> 
> 
> 
> https://www.3dmark.com/dxr/15527


You're a beast, Soldier... the best I could get was 2190 MHz: *51.81* (on my 2x 8-pin TUF) http://www.3dmark.com/dxr/15581


----------



## ssgwright

uh oh.. just hit *52.12 *http://www.3dmark.com/dxr/15583 this was at 2205 mhz


----------



## dr.Rafi

ssgwright said:


> hmm I can't get the silver conductive paint here in hawaii, it can't be shipped by air and I can't seem to find it anywhere on island... thinking about just using solder but I'm worried it will put the card in limp mode


Practice on a dead or old PCB first.


----------



## ssgwright

dr.Rafi said:


> practice on any dead or old pcb first


ah.. good idea, thank you Dr


----------



## ssgwright

I've soldered before, and I've seen people using some kind of gel or something after applying the solder?


----------



## Falkentyne

dr.Rafi said:


> practice on any dead or old pcb first


Excellent advice. You can even disassemble an old mouse or game controller, find something to stick a shunt on, and then stack another one on top for practice.
You can even practice soldering the shunt, desoldering it, and soldering it again, since those will be junk shunts anyway.

You could also theoretically grab a throwaway shunt (like a 1 or 100 mOhm one), attach it to a solid, expendable, solder-safe surface or workbench with solder if you don't have access to a PCB, and then practice soldering another shunt on top of it.

I know very little about soldering. The only things I've ever soldered are mouse and keyboard microswitches, and that's already hard enough now with my back problems. And I know the key is to use flux to help direct the solder flow.









Understanding Soldering - Part 4: How to Use Flux When Soldering Electronics | Tempo







That's important, since you want solder to go to the joint you are soldering and not somewhere you don't want it. This can be a heart-stopper if you aren't experienced (which is why I avoid working with very small objects; mouse microswitches are already at the limit of my soldering skill).


----------



## dr.Rafi

Falkentyne said:


> Excellent advice. You can even disassemble an old mouse or game controller and find something to stack or stick a shunt on and then stack another one for practice.
> You can even practice soldering then desoldering the shunt and soldering it again, since those will be junk shunts anyway.
> 
> You could also theoretically grab some throwaway shunt (like a 1 or 100 mOhm one), attach it onto a solid, expendable, solder-safe surface or a workbench with solder, if you don't have access to a PCB, and then practice soldering another shunt on top of it.
> 
> I know very little about soldering. The only thing I've ever soldered are mouse and keyboard microswitches and that's already hard enough now with my back problems. And I know the key is to use flux to help 'direct' the solder flow.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Understanding Soldering - Part 4: How to Use Flux When Soldering Electronics | Tempo (www.tempoautomation.com)
> 
> 
> 
> 
> 
> Important since you want solder to go to the joint you are soldering, and not to some place you don't want it to be. This can be a heart stopper if you aren't experienced (which is why I avoid trying to work with very small objects. Mouse microswitches are already at my limit of soldering skill).





ssgwright said:


> ah.. good idea, thank you Dr


Shunts are easy to find in computer switching power supplies, or in any old TV power supply, to practice on or even to use. The only issue is that you can read their values in ohms, but you can't know their rated wattage; best to use one rated 1 watt or more. For soldering practice that looks professional, you need liquid flux with a syringe, and copper desoldering braid








with flux to remove any excess solder left over and make it look professional. This braid absorbs the melted solder very quickly and leaves only enough to keep a good electrical contact. If there are tiny capacitors and other components beside the shunt, it's better to cover them with Kapton tape to protect them from dropped solder, which can short them; sometimes a tiny short won't even show any issues but might degrade certain other components with time, so be careful. Finally, it's better to inspect the soldered area with a magnifying glass to be sure no mistakes were made.


----------



## ssgwright

dr.Rafi said:


> Shunts are easy to find in computer switching power supplies, or in any old TV power supply, to practice on or even to use. The only issue is that you can read their values in ohms, but you can't know their rated wattage; best to use one rated 1 watt or more. For soldering practice that looks professional, you need liquid flux with a syringe, and copper desoldering braid
> View attachment 2465056
> 
> with flux to remove any excess solder left over and make it look professional. This braid absorbs the melted solder very quickly and leaves only enough to keep a good electrical contact. If there are tiny capacitors and other components beside the shunt, it's better to cover them with Kapton tape to protect them from dropped solder, which can short them; sometimes a tiny short won't even show any issues but might degrade certain other components with time, so be careful. Finally, it's better to inspect the soldered area with a magnifying glass to be sure no mistakes were made.


ok... now I'm lost


----------



## dr.Rafi

ssgwright said:


> ok... now I'm lost






Watch this video, I learned a lot from this man.


----------



## DStealth

or just get a conductive glue


----------



## asdkj1740

3080 Vision 5+1 again; it seems Gigabyte is changing to 5+1 on the new production batch.









創作至上! GIGABYTE GeForce RTX 3080 VISION OC 10G 評測開箱 [Creativity above all! GIGABYTE GeForce RTX 3080 VISION OC 10G review and unboxing] (unikoshardware.com)


----------



## naikee

I was able to purchase the MSI RTX 3080 Ventus x3 OC (v2 PCB), is it really that bad compared to the others?


----------



## Anth0789

Just got my Gigabyte RTX 3080 Monday, wow what a jump compared to my old GTX 1070.


----------



## asdkj1740

naikee said:


> I was able to purchase the MSI RTX 3080 Ventus x3 OC (v2 PCB), is it really that bad compared to the others?


how do you know it is v2? can you show us the pcb?


----------



## naikee

asdkj1740 said:


> how do you know it is v2? can you show us the pcb?


It had one of the better MLCC's, not all 6 were those other black plastic ones.


----------



## SEALBoy

naikee said:


> It had one of the better MLCC's, not all 6 were those other black plastic ones.


All shipped Ventus cards are like that.


----------



## ssgwright

here's my latest PR: *12,705*


http://www.3dmark.com/pr/492238


----------



## xermalk

Anyone know anything about the INNO3D RTX 3080 iChill Frostbite?
I have one on order, but man, that has to be the least amount of copper I've ever seen in a GPU waterblock.









Inno3D launches GeForce RTX 3090 and RTX 3080 iChill Frostbite series (videocardz.com)


----------



## dr.Rafi

ssgwright said:


> here's my latest PR: *12,705 *
> 
> 
> http://www.3dmark.com/pr/492238


With Shunts?


----------



## BulletSponge

Add me to the list, just got my latest addition installed. The 1080 Ti is going to a friend; it was in my daughter's rig until now.


----------



## ssgwright

dr.Rafi said:


> With Shunts?


Yes, shunt mod applied. Before, GPU-Z would read a 350W pull; now it maxes at 260W.


----------



## dr.Rafi

ssgwright said:


> Yes, shunt mod applied. Before, GPU-Z would read a 350W pull; now it maxes at 260W.


Congratulations on the successful shunting and soldering. Just wondering, what was your max Port Royal score before modding?


----------



## ssgwright

no need for congrats.. lol... I got scared of soldering so I'm running liquid metal. Not for long, I will do the soldering soon.

this was the best I could get before the shunt mod: http://www.3dmark.com/pr/484350 *12272*

has anyone seen an instance where a 30-series card has gone into safe mode? I used a lot of liquid metal (not so much that it could drip), but I used so much on all the shunts that I assumed it would go into safe mode, and it didn't.


----------



## BluePaint

TS 20.029 GPU
https://www.3dmark.com/spy/15191750
2205 MHz max, 2148 MHz avg
voltage +75%, +111MHz core, +1275 VRAM on chilled air, [email protected] core

So far, highest score for 3080 + 3900X combo on 3dmark.
For more GPU points I need a different CPU + RAM, I guess. I actually have a 5800X here, but the AGESA/MSI BIOS seems to be bugged for FCLK/RAM overclocking, which is annoying.

Respect for all that shunting and soldering! Guess I am too chicken for that, lol.


----------



## DStealth

Great card you have ...my best with this combo was 200points less and a massive -350 on the GPU score


https://www.3dmark.com/compare/spy/15191750/spy/15084817


----------



## BluePaint

@DStealth
Thanks. Great to see u here! I was wondering how u manage to get 4675Mhz on your 3900X (voltage, cooling, ...) ? Really great CPU score!
Regarding the GPU points, your average GPU frequency is also pretty good. Therefore, RAM could be a factor but also your NVIDIA control panel settings. Phoenix posted a screenshot of best CP settings a few pages back I think.

Here is my RAM for comparison:


----------



## DStealth

Chilled water... I can bench much higher, 4750 in CB15/20, but I'm not pushing it hard as the card is not good. +500 VRAM is reducing the scores; a max of +200-300 for a top FTW3 Ultra card seems very bad.
My memory is better @ 3840MHz


----------



## BluePaint

@DStealth
Esp latency is great! Should have expected that your RAM is already very optimized .
So the VRAM seems to be the weak point, which is annoying. Personally, I am looking forward to upcoming 20GB VRAM version cause 10GB is already too little in some cases (I mostly use extra high res rather than high FPS). Then the lottery starts again.


----------



## asdkj1740

AORUS GeForce RTX™ 3080 XTREME WATERFORCE 10G (rev. 1.0) Key Features | Graphics Card - GIGABYTE Global (www.gigabyte.com)












AORUS GeForce RTX™ 3080 XTREME WATERFORCE WB 10G (rev. 1.0) Key Features | Graphics Card - GIGABYTE Global (www.gigabyte.com)






xtreme waterforce dual 8pin
xtreme air triple 8 pin
well done, well done gigabyte


----------



## Vapochilled

asdkj1740 said:


> AORUS GeForce RTX™ 3080 XTREME WATERFORCE 10G (rev. 1.0) Key Features | Graphics Card - GIGABYTE Global (www.gigabyte.com)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> AORUS GeForce RTX™ 3080 XTREME WATERFORCE WB 10G (rev. 1.0) Key Features | Graphics Card - GIGABYTE Global (www.gigabyte.com)
> 
> 
> 
> 
> 
> 
> xtreme waterforce dual 8pin
> xtreme air triple 8 pin
> well done, well done gigabyte



Any idea about the TDP for these cards ?


----------



## lordzed83

Add me. My preorder came the same day as the lucky FE drop. Mate got an EAGLE, I got the FE.


----------



## lordzed83

Anyway, I've got some alpha software I've glued together that lets you check if a memory overclock produces ERRORS. Nothing too epic, but it does the job.
Initial tests







Gives a result like this now







Link to my Onedrive


----------



## Falkentyne

lordzed83 said:


> Anyway, I've got some alpha software I've glued together that lets you check if a memory overclock produces ERRORS. Nothing too epic, but it does the job.
> Initial tests
> View attachment 2465190
> 
> Gives a result like this now
> View attachment 2465191
> 
> Link to my Onedrive


I don't believe this test is reliable.
Testing this on my 3090

What I don't understand is that +600 MHz has no errors in your test and is stable in games. (Core +135)
But 700 mhz memory (Core +135) just crashes the driver in games / black screens / freezes then reboots the system in games, but 0 errors in your test.

So I think this test is not reliable.
The memory overclock may depend on the core clock and the core being loaded in 3D as well.
Because it is impossible to pass +135 / +700 in Heaven benchmark. The system crashes...but no errors in your test.

_Edit_ yep I just confirmed this test can't be used reliably.
+135 / +700: 0 errors

+0 / +700: Heaven crashes the driver (Actually it rebooted the computer after 3 minutes of running! Checked windows event log--the driver didn't even recover or report a hardware event. It just hard locked and rebooted). So in this case it isn't even the core overclock. Core was stock +0. +700 RAM is just not stable.

Looping Heaven benchmark is still the best test for overclocked VRAM. If it doesn't crash, you're stable. If it crashes or reboots your computer....


----------



## lordzed83

Falkentyne said:


> I don't believe this test is reliable.
> Testing this on my 3090
> 
> What I don't understand is that +600 MHz has no errors in your test and is stable in games. (Core +135)
> But 700 mhz memory (Core +135) just crashes the driver in games / black screens / freezes then reboots the system in games, but 0 errors in your test.
> 
> So I think this test is not reliable.
> The memory overclock may depend on the core clock and the core being loaded in 3D as well.
> Because it is impossible to pass +135 / +700 in Heaven benchmark. The system crashes...but no errors in your test.


This is for the 3080 atm. I don't have a 3090 to change the values for, as this only allocates 8000MB. Have you read the tester's file?? Actual video explaining how it works:


I assume you know that a CRASH can be due to not enough JUICE for +700, hehehe; as I said, I've only tested the error detection. My 3080, tested in benchmarks, can do +1000 with NO CRASH.
This is not a STABILITY TEST of the GPU. This is to check whether you are having memory errors that hit GDDR6X performance because of its error correction.
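For anyone wondering what a tester like this does under the hood: conceptually, it fills a buffer with a known pseudo-random pattern, reads it back, and counts mismatches (GDDR6X corrects errors at a performance cost, which is why errors can hurt FPS before anything crashes). A minimal CPU-side Python sketch of the idea; a real VRAM tester would allocate the buffer on the GPU, and these function names are made up for illustration:

```python
import random

def make_test_buffer(n_words: int, seed: int = 0) -> list[int]:
    """Fill a 'memory' buffer with a reproducible pseudo-random 32-bit pattern."""
    rng = random.Random(seed)
    return [rng.getrandbits(32) for _ in range(n_words)]

def count_pattern_errors(buf: list[int], seed: int = 0) -> int:
    """Regenerate the same pattern and count words that no longer match."""
    rng = random.Random(seed)
    return sum(1 for word in buf if word != rng.getrandbits(32))

buf = make_test_buffer(1_000_000)
print(count_pattern_errors(buf))   # stable memory: 0 errors
buf[12345] ^= 0x1                  # simulate a single flipped bit
print(count_pattern_errors(buf))   # now: 1 error
```

An idle check like this can miss instability that only shows up under 3D load, which is consistent with the Heaven-crashing-but-zero-errors reports in this thread.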


----------



## heavyarms1912

Anyone here with a Gaming OC RTX 3080 facing issues with thermal throttling because of GDDR6X temps? I am planning to change the thermal pads to experiment, and wanted input on the thickness of the stock thermal pads.


----------



## lordzed83

Falkentyne said:


> I don't believe this test is reliable.
> Testing this on my 3090
> 
> What I don't understand is that +600 MHz has no errors in your test and is stable in games. (Core +135)
> But 700 mhz memory (Core +135) just crashes the driver in games / black screens / freezes then reboots the system in games, but 0 errors in your test.
> 
> So I think this test is not reliable.
> The memory overclock may depend on the core clock and the core being loaded in 3D as well.
> Because it is impossible to pass +135 / +700 in Heaven benchmark. The system crashes...but no errors in your test.
> 
> _Edit_ yep I just confirmed this test can't be used reliably.
> +135 / +700: 0 errors
> 
> +0 / +700: Heaven crashes the driver (Actually it rebooted the computer after 3 minutes of running! Checked windows event log--the driver didn't even recover or report a hardware event. It just hard locked and rebooted). So in this case it isn't even the core overclock. Core was stock +0. +700 RAM is just not stable.
> 
> Looping Heaven benchmark is still the best test for overclocked VRAM. If it doesn't crash, you're stable. If it crashes or reboots your computer....


Just to show ya +1200 on my 3080 in valley


----------



## Falkentyne

lordzed83 said:


> Just to show ya +1200 on my 3080 in valley


Yes I read both text files fully. I understand how to edit bat files. you know I used to do that in the MS DOS days...
I set memory usage to 13 GB manually. No errors in the log files.

But Heaven crashes. And as I said it's not the core overclock. And there is no such thing as changing memory voltage.
You also used Valley. Valley is easy to pass. It doesn't even use Tessellation. Try looping Heaven for 30 minutes.

But I guess all we can say is that no two cards are the same. But what exactly is the point --for GAMERS-- if you get 0 errors on your test but your computer crashes the driver or reboots when you play a game? That's not very helpful to us...
It may work for you but it's not going to do that for every system.


----------



## Zeakie

Amp holo 374w vbios zotac.rom


----------



## Huntkey

Zotac AMP HOLO - any chance you know what PCB it has? I have ordered 3 of them and I want to order water blocks for them... it would be super helpful


----------



## zhrooms

Zeakie said:


> Amp holo 374w vbios zotac.rom


What is the default power limit? Feel free to upload the BIOS to TPU using GPU-Z, once successfully uploaded it'll show up here.


----------



## DirtyScrubz

EK sure is taking its sweet time releasing the Strix block. Speaking of which, the middle fan on my Strix gets a rattling sound at high rpm.


----------



## dr.Rafi

lordzed83 said:


> Just to show ya +1200 on my 3080 in valley


I am stable @ +1500, but performance drops after +1360


----------



## dr.Rafi

DirtyScrubz said:


> EK sure is taking its sweet time releasing the Strix block. Speaking of which, the middle fan on my Strix gets a rattling sound at high rpm.


Got a Bykski; quick shipping, and it's performing better than the EK I had for my 2080 Ti. Max temperature 42 under crazy load, with the shunt power unlocked and overclocked to max, memory stable @ +1500. The backplate isn't even warm to the touch during long load, meaning the GPU, VRAM, and power stages are all happy.


----------



## dr.Rafi

ssgwright said:


> no need for congrats.. lol... I got scared of soldering so I'm running liquid metal. Not for long, I will do the soldering soon.
> 
> this was the best I could get before the shunt mod: http://www.3dmark.com/pr/484350 *12272*
> 
> has anyone seen an instance where a 30-series card has gone into safe mode? I used a lot of liquid metal (not so much that it could drip), but I used so much on all the shunts that I assumed it would go into safe mode, and it didn't.


No, it won't. But in low-load tests (low resolution and no anti-aliasing) it will start to fluctuate up and down, and the power draw spikes to 700 watts and then down to 200 watts, monitored at the wall socket. So it's not good to use in low-load games.


----------



## dr.Rafi

Gears 5 performance with the Bykski water block: CPU at default, no overclock, but system memory running 4266 dual rank; GPU core holds at 2100 or above, memory +1200. Before, with the CPU waterblock on, the max I got was 152 fps average, and that was with the CPU clocked @ 5300 all cores and memory at 4400 with the same subtimings.


----------



## dr.Rafi

PhoenixMDA said:


> I'm waiting for my waterblock... it should have arrived 3 weeks ago


EK?
Honestly, AliExpress. Chose express shipping: 3 days with China Post, and the last 2 days with DHL Hong Kong and Australia.
The quality is great and the RGB looks great too.


----------






## Krisztias

Dear FE Owners,

did you get a waterblock already? If yes, which one? Temps? If not, which one do you want to buy?
Thank you.


----------



## dr.Rafi

DStealth said:


> Chilled water ...can bench much higher 4750 CB15/20 but not pushing it hard as the card is not good...+500 Vram is reducing the scores max +200-300 for top FTW3 Ultra card...seems very bad.
> My memory is better @3840Mhz


Gold 3900x


----------



## Nizzen

dr.Rafi said:


> EK?
> honstly aliexpress ,chosed express shipping 3days was with China post,and last 2 days with dhl HongKong and Australia.
> the quality is great and rgb looks great too .


My Bykski 3090 Strix block didn't fit 100%. It was interfering with the fan header, so the PCB was bending. There were no instruction papers and not enough thermal pads. I guess it's because it is pretty cheap?
The block is cooling pretty well though...


----------



## Nizzen

dr.Rafi said:


> Cold 3900x


👨‍🔧🛠


----------



## Koby990

Can an RTX 3080 Founders Edition be flash modded? I ask due to the different power connector.


----------



## ssgwright

Koby990 said:


> Can an RTX 3080 Founders Edition be flash modded? I ask due to the different power connector.


yes but it won't do you much good


----------



## Koby990

You mean raising the power limit won't improve performance even if the card is on water?


----------



## ssgwright

I've tried every BIOS there is.. no power boost... nothing, max pull for the TUF is 350W. From what I've seen from people with the FE, it's the same.


----------



## ssgwright

it will... if it worked... but flashing the BIOS doesn't seem to help raise the power limit


----------



## Krisztias

ssgwright said:


> I've tried every BIOS there is.. no power boost... nothing, max pull for the TUF is 350W. From what I've seen from people with the FE, it's the same.


My FE pulls 369.7W in HWiNFO.


----------



## lordzed83

dr.Rafi said:


> I am stable @ +1500, but performance drops after +1360


That's why I came up with this software: to see what amount of errors starts to have a negative impact on actual gaming performance.


----------



## lordzed83

Falkentyne said:


> Yes I read both text files fully. I understand how to edit bat files. you know I used to do that in the MS DOS days...
> I set memory usage to 13 GB manually. No errors in the log files.
> 
> But Heaven crashes. And as I said it's not the core overclock. And there is no such thing as changing memory voltage.
> You also used Valley. Valley is easy to pass. It doesn't even use Tesselation. Try looping Heaven for 30 minutes.
> 
> But I guess all we can say is that no two cards are the same. But what exactly is the point --for GAMERS-- if you get 0 errors on your test but your computer crashes the driver or reboots when you play a game? That's not very helpful to us...
> It may work for you but it's not going to do that for every system.


Well, I got no problem in Port Royal either; ran it for 10 minutes with +1400MHz on the memory.
Check what happens if ya drop the core clock to 1900, lock it there, and try to push the memory. Wonder what happens.


----------



## lordzed83

ssgwright said:


> I've tried every BIOS there is.. no power boost... nothing, max pull for the TUF is 350W. From what I've seen from people with the FE, it's the same.


My 3080 FE bounces off the 375-watt power limit in MSI AB


----------



## derthballs

Could anyone tell me what BIOS I could flash my Zotac OC to, to increase the power limit from 100%? It's a 2x 8-pin card.


----------



## Nizzen

derthballs said:


> Could anyone tell me what BIOS I could flash my Zotac OC to, to increase the power limit from 100%? It's a 2x 8-pin card.


Gigabyte 3080 gaming oc, or try everyone 

Post results here


----------



## derthballs

Out of interest, if you flash a BIOS and it doesn't work / cocks it up, do you have to blind flash back? And if so, how would I do that? I assume I'd need a secondary card in there or something. I flashed my 3090 no problem, but I haven't seen anyone here with a flashed Zotac from searching, so I'm a bit more hesitant about that.


----------



## LukeOverHere

Hey guys, just in case anybody is on the fence: I managed to get my hands on a Colorful iGame RTX 3080 Advance OC Edition (yes, it was very lucky; found a small shop that had one in stock), so I have cancelled the ASUS TUF OC I had on back order.

I must say, I'm impressed with this card. The entire shroud is metal, everything on the card is metal (it's heavy af), and it takes up a full three slots, but it is a decent card (looks good as well imho). Got it home, manually added +130MHz to the core clock and +600MHz to the memory clock, pulled the power limit to 106% and the max temp to 91 degrees (not that it will ever reach that). I'm stable after 5 hours of Destiny 2 running at 4K (literally everything on max) @ 2100MHz core clock and 10,102MHz memory clock, absolute max temp 71 degrees (it stays lower, but does not exceed 71, which is good).

I'm not sure what everybody else is getting (well... for the lucky few who have access to one of these), but this seems like a decent card!! Anybody else played with these Colorful iGame branded cards? Anybody pushed a more aggressive overclock on this model? It appears to have a 400W power limit with 3x 8-pin connectors, which is probably why these cards clock quite well.
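As a sanity check on memory numbers like those: tools such as Afterburner typically show GDDR6X at its DDR rate, so the effective data rate is twice the displayed clock, and bandwidth follows from the bus width. A quick sketch; the 2x reporting convention is my assumption here:

```python
def effective_gbps(reported_mhz: float) -> float:
    """Afterburner-style displayed clock (DDR rate) -> effective per-pin data rate in Gbps."""
    return reported_mhz * 2 / 1000

def bandwidth_gb_s(gbps: float, bus_width_bits: int) -> float:
    """Per-pin data rate times bus width, divided by 8 bits per byte."""
    return gbps * bus_width_bits / 8

print(effective_gbps(9502))          # stock 3080: ~19.0 Gbps
print(effective_gbps(10102))         # the +600 offset above: ~20.2 Gbps
print(bandwidth_gb_s(19.0, 320))     # 760.0 GB/s, matching the 3080 spec sheet
```

So a card sitting at 10,102MHz displayed is moving roughly 808 GB/s instead of the stock 760 GB/s, assuming no error correction is eating into it.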


----------



## dr.Rafi

Nizzen said:


> My Bykski 3090 Strix block didn't fit 100%. It was interfering with the fan header, so the PCB was bending. There were no instruction papers and not enough thermal pads. I guess it's because it is pretty cheap?
> The block is cooling pretty well though...


These minor issues don't bother me. There were plenty of thermal pads and heaps of screws, but I used the original thermal pads, which are superior quality (thicker and softer). The fan header is an easy fix: take the plastic of the fan header out if possible and bend the pins slightly so they stop interfering. If the fan header is the type that doesn't come out, I simply use a handheld Dremel to grind down the area that interferes with components, as long as I don't expose the water chamber of the block. They're good quality; mine fit perfectly, though. Bykski is still a fresh company, and I thank them for their in-stock availability of blocks for the whole 3000-card family. I prefer a cheap waterblock; I live in Australia, and for me it's impossible to sell the block after a couple of years (gamers rarely do custom watercooling), but the graphics card itself I can sell for a good price even after 3 years. BTW, how are your scores and boost clocks with the Strix 3090 water cooled?


----------



## dr.Rafi

derthballs said:


> Out of interest, if you flash a BIOS and it doesn't work / cocks it up, do you have to blind flash back? And if so, how would I do that? I assume I'd need a secondary card in there or something. I flashed my 3090 no problem, but I haven't seen anyone here with a flashed Zotac from searching, so I'm a bit more hesitant about that.


You can use any cheap old graphics card as a secondary to boot from, to flash the card back (un-blindly).
BTW, nvflash automatically flashes your BIOS to the correct card, even without choosing which card you want to flash. I tested it, and it worked even with a 2080 Ti as primary and the 3080 as secondary: nvflash flashed the 3080 when I used a 3080-family BIOS.


----------



## dr.Rafi

LukeOverHere said:


> Hey guys, just in case anybody is on the fence: I managed to get my hands on a Colorful iGame RTX 3080 Advance OC Edition (yes, it was very lucky; found a small shop that had one in stock), so I have cancelled the ASUS TUF OC I had on back order. I must say, I'm impressed with this card. The entire shroud is metal, everything on the card is metal (it's heavy af), and it takes up a full three slots, but it is a decent card (looks good as well imho). Got it home, manually added +130MHz to the core clock and +600MHz to the memory clock, pulled the power limit to 106% and the max temp to 91 degrees (not that it will ever reach that). I'm stable after 5 hours of Destiny 2 running at 4K (literally everything on max) @ 2100MHz core clock and 10,102MHz memory clock, absolute max temp 71 degrees (it stays lower, but does not exceed 71, which is good). I'm not sure what everybody else is getting (well... for the lucky few who have access to one of these), but this seems like a decent card!! Anybody else played with these Colorful iGame branded cards? Anybody pushed a more aggressive overclock on this model? It appears to have a 400W power limit with 3x 8-pin connectors, which is probably why these cards clock quite well.


You Live in WA ?


----------



## ssgwright

welp I couldn't help myself this thing is just too cool... can't wait for this to get here: EK-QuantumX Delta TEC - Copper + Nickel

I'm just worried about it increasing the temps in my loop and affecting my 3080 oc


----------



## derthballs

dr.Rafi said:


> You can use any cheap old graphics card as a secondary to boot from, to flash the card back (un-blindly).
> BTW, nvflash automatically flashes your BIOS to the correct card, even without choosing which card you want to flash. I tested it, and it worked even with a 2080 Ti as primary and the 3080 as secondary: nvflash flashed the 3080 when I used a 3080-family BIOS.


Thank you, I flashed to the Gigabyte one; it still has a 100% limit rather than more. I take it this is usual for these cards? Got a bit more from the card though, so happy with that for now.


----------



## derthballs

Nizzen said:


> Gigabyte 3080 gaming oc, or try everyone
> 
> Post results here


So I was able to get a bit more out of the Zotac with this BIOS, boosting in games to 2055/2070. If I tried a 100MHz overclock on the Zotac BIOS it was crashing. Did a stability test with Port Royal and I'm all good; played a few hours of Valhalla with no problems. Thank you for the advice.


----------



## dr.Rafi

ssgwright said:


> I've tried every BIOS there is.. no power boost... nothing, max pull for the TUF is 350W. From what I've seen from people with the FE, it's the same.


Your card is shunted; do you compensate for how much it is really pulling? With a short direct-solder or liquid-metal shunt connection, it should max out at around 700 to 800, and try measuring at the wall socket.


----------



## dr.Rafi

If it shows 350, it means it is really more than 700
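The arithmetic behind that estimate: the card senses power from the voltage drop across its shunt resistors, so lowering the effective shunt resistance makes it under-report power by the same ratio. A minimal sketch; 5 mOhm is a typical stock value for these cards, but the exact factor of any liquid-metal mod is unknown, so treat the numbers as illustrative:

```python
def stacked_resistance_mohm(r1: float, r2: float) -> float:
    """Two shunts stacked on top of each other combine like parallel resistors."""
    return (r1 * r2) / (r1 + r2)

def true_power_w(reported_w: float, r_stock: float, r_effective: float) -> float:
    """The sensed voltage drop scales with resistance, so after the mod the
    reading under-reports by a factor of r_effective / r_stock."""
    return reported_w * (r_stock / r_effective)

# Stacking an identical 5 mOhm shunt halves the effective resistance:
r_mod = stacked_resistance_mohm(5.0, 5.0)   # 2.5 mOhm
print(true_power_w(260, 5.0, r_mod))        # reported 260 W -> 520.0 W actual
print(true_power_w(350, 5.0, r_mod))        # reported 350 W -> 700.0 W actual
```

A liquid-metal bridge drops the resistance by an uncontrolled amount, so the scaling factor can be well above 2x; that is why measuring at the wall is the only reliable check.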


----------



## ssgwright

with GPU-Z, after the shunt it says 260W max


----------



## LukeOverHere

dr.Rafi said:


> You Live in WA ?


Yes im in Perth, WA. You local as well?


----------



## dr.Rafi

ssgwright said:


> with gpu-z after shunt it says 260w max


So roughly 500 watt


----------



## dr.Rafi

LukeOverHere said:


> Yes im in Perth, WA. You local as well?


Yes mate, Sydney. I asked because Colorful cards are very rare on our side and plentiful on yours. How is the weather, too hot?
With no aircon, I start dropping my overclock settings every couple of days as it gets warmer.


----------



## dr.Rafi

derthballs said:


> So I was able to get a bit more out of the Zotac with this BIOS, boosting in games to 2055/2070. If I tried a 100MHz overclock on the Zotac BIOS it was crashing. Did a stability test with Port Royal and I'm all good; played a few hours of Valhalla with no problems. Thank you for the advice.


BTW, PR doesn't give you a good stability indication; try heavy games.


----------



## VPII

dr.Rafi said:


> BTW, PR doesn't give you a good stability indication; try heavy games.


Very good suggestion. Like with my MSI Gaming X Trio: I can run benchmarks at up to +165 or even +180MHz, but for gaming I need to drop it to +135MHz.


----------



## ssgwright

VPII said:


> Very good suggestion. Like with my MSI Gaming X Trio I can run benchmarks up to +165 to even +180mhz but for gaming I need to drop it to +135mhz.


wow your numbers are almost exactly mine for benching and gaming with shunts


----------



## VPII

ssgwright said:


> wow your numbers are almost exactly mine for benching and gaming with shunts


No shunts for me, at present it is more about cooling it properly.


----------



## dr.Rafi

VPII said:


> Very good suggestion. Like with my MSI Gaming X Trio I can run benchmarks up to +165 to even +180mhz but for gaming I need to drop it to +135mhz.


Port Royal doesn't heavily load the main rasterization cores, mostly the RT cores, and it also doesn't heavily load the VRAM, system CPU, and RAM. A heavy game uses more of the rasterization cores, the VRAM, and the RT cores if it has a raytracing feature. Look at the raytracing feature test in 3DMark: it is purely RT-core dependent, so you can easily clock to 2190+. In the future I expect most cards will be mostly RT cores, and game makers will implement them heavily in their games. Another point: if you overclock the CPU to max and then the GPU to max, and then run the CPU and system at default, you will be able to push the GPU a little more, and vice versa.


----------



## dr.Rafi

ssgwright said:


> wow your numbers are almost exactly mine for benching and gaming with shunts


The number you add to the core clock in Afterburner differs from BIOS to BIOS. With the Asus Strix BIOS I can go up to 2070 in GPU-Z, but with the Gigabyte Master only to 1950 in GPU-Z. What matters is performance, and your boost clock while the GPU is loaded.


----------



## VPII

dr.Rafi said:


> Port Royal doesn't heavily load the main rasterization cores, mostly the RT cores, and it also doesn't heavily load the VRAM, system CPU, and RAM. A heavy game uses more of the rasterization cores, the VRAM, and the RT cores if it has a raytracing feature. Look at the raytracing feature test in 3DMark: it is purely RT-core dependent, so you can easily clock to 2190+. In the future I expect most cards will be mostly RT cores, and game makers will implement them heavily in their games. Another point: if you overclock the CPU to max and then the GPU to max, and then run the CPU and system at default, you will be able to push the GPU a little more, and vice versa.


Yup, the ray tracing feature test is a joke actually. I mean, where my card can run all games up to 2130 to 2145MHz, I can run the Ray Tracing feature test at 2205 to even 2220MHz.


----------



## dr.Rafi

VPII said:


> Yup, the ray tracing feature test is a joke actually. I mean, where my card can run all games up to 2130 to 2145MHz, I can run the Ray Tracing feature test at 2205 to even 2220MHz.


It's not a joke, it is the future.


----------



## derthballs

dr.Rafi said:


> Btw pr dont give you agood stability indication try Heavy games .


Yeah, I worked that out. I could easily do +200MHz core in PR, but that would crash within 5 minutes of Valhalla and Doom. I backed off to +100, which still boosts up to 2070 max and usually sits around 2055, so I'm pretty happy with that. I played Valhalla for 3 hours last night without any crashes.


----------



## arrow0309

Hiya, as of today which BIOS do ya lads recommend for my X Trio, the FTW3 Ultra or the Strix one?


----------



## saar

Hi, I got the Zotac Trinity OC edition. I use the TUF OC BIOS; it's the only BIOS I've checked.
In games I can run the core at 2130-2145 and 800 on the memory. I think I can go higher on the memory, I just haven't checked it yet.
With the Zotac OC BIOS I can't get these numbers.


----------



## derthballs

saar said:


> Hi, I got the Zotac Trinity OC edition. I use the TUF OC BIOS; it's the only BIOS I've checked.
> In games I can run the core at 2130-2145 and 800 on the memory. I think I can go higher on the memory, I just haven't checked it yet.
> With the Zotac OC BIOS I can't get these numbers.


Thanks, does that boost go over 100%?


----------



## VPII

arrow0309 said:


> Hiya, as of today which bios do ya lads recommend for my X Trio, the FTW3 Ultra or the Strix One?


I tried both of those BIOSes on my X Trio and they did not help all that much; they actually increased temps by 2 to 3°C.


----------



## saar

derthballs said:


> Thanks, does that boost go over 100%?


106-109%


----------



## derthballs

saar said:


> 106-109%


Thanks. One last thing: did the connectors on the back of the card work as normal with that BIOS?

Edit: ignore that, I just tried it and it's working fine. Thanks again for your help.


----------



## saar

Sorry, I don't understand; my English is not very good. 😌


----------



## Miro75

Hi... I shunted my 3080 Ventus but it's still hitting the power limit. 5 mOhm, 5 resistors on the top + the PCIe one on the bottom. And... nothing, look at the pic. Any idea what is wrong?


----------



## Falkentyne

Miro75 said:


> Hi... I shunted my 3080 Ventus but it's still hitting the power limit. 5 mOhm, 5 resistors on the top + the PCIe one on the bottom. And... nothing, look at the pic. Any idea what is wrong?
> View attachment 2465422


MVDCC should not be 58w. That is WAY too high. With 1.33x multiplier in hwinfo64 for my mod (GPU-Z does not have multipliers), I get 26.4W after multiplier at 503W power draw (after multiplier). Your MVDCC should be reporting 18W on yours before multiplier (I think your multiplier value is *2, which GPU-Z does not support changing).
If a 2x multiplier (5 mOhm on top of 5 mOhm) were applied, that would put your MVDCC at 116W!

And your base PCIE slot power is reporting really really low. Even if you double the board power draw (2x multiplier), that's still only 41W from PCIE. Something is wrong with the mod.

Check your shunt mod. How did you mod it? Did you solder? Did you use conductive paint?

Did you scrape off the conformal coating with a small flat blade screwdriver from the edges of the shunts, until it was bright silver showing?

On your original shunts, was the 'silver' part lower in height than the black middle package? Or was the original shunt completely flat (silver edges at the same height as the black middle?)

Here is what your power draw should be. Mine is with 15 mOhm conductive paint mod. GPU-Z are the stock values, HWinfo are the corrected values.
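The multiplier values quoted in this thread (1.33x, 1.67x, 2x) all fall out of simple resistor math. A minimal sketch of that arithmetic (my own illustration, not from any tool mentioned here):

```python
# Stacking a shunt on the stock shunt puts the two in parallel, lowering the
# sensed resistance; the card then under-reads power by a fixed factor.
def parallel(r1, r2):
    """Combined resistance of a shunt stacked on top of the stock shunt."""
    return r1 * r2 / (r1 + r2)

def multiplier(r_stock, r_effective):
    """Factor to apply to reported watts to recover real watts."""
    return r_stock / r_effective

stock = 0.005  # 5 mOhm stock shunts on these cards

# ~15 mOhm conductive-paint layer over 5 mOhm -> ~1.33x (the paint mod above)
paint = multiplier(stock, parallel(stock, 0.015))
# 5 mOhm stacked on 5 mOhm -> 2.0x
stacked = multiplier(stock, parallel(stock, 0.005))
# 3 mOhm *replacing* the stock 5 mOhm -> ~1.67x (olrdtg's mod)
replaced = multiplier(stock, 0.003)

print(round(paint, 2), round(stacked, 2), round(replaced, 2))  # 1.33 2.0 1.67
```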


----------



## Miro75

Falkentyne said:


> MVDCC should not be 58w. That is WAY too high. With 1.33x multiplier in hwinfo64 for my mod (GPU-Z does not have multipliers), I get 26.4W after multiplier at 503W power draw (after multiplier). Your MVDCC should be reporting 18W on yours before multiplier (I think your multiplier value is *2, which GPU-Z does not support changing).
> If a 2x multiplier (5 mOhm on top of 5 mOhm) were applied, that would put your MVDCC at 116W!
> 
> And your base PCIE slot power is reporting really really low. Even if you double the board power draw (2x multiplier), that's still only 41W from PCIE. Something is wrong with the mod.
> 
> Check your shunt mod. How did you mod it? Did you solder? Did you use conductive paint?
> 
> Did you scrape off the conformal coating with a small flat blade screwdriver from the edges of the shunts, until it was bright silver showing?
> 
> On your original shunts, was the 'silver' part lower in height than the black middle package? Or was the original shunt completely flat (silver edges at the same height as the black middle?)
> 
> Here is what your power draw should be. Mine is with 15 mOhm conductive paint mod. GPU-Z are the stock values, HWinfo are the corrected values.
> 
> View attachment 2465425


All resistors are soldered and I'm sure the joints are reliable. I scraped off the conformal coating. Have you shunted all 5 resistors on the top?

Look at this: stock values, before the shunt. The PCIe power draw is very low...


----------



## Stash

How much vdroop is normal? My #2 connector always hits 11.9v on full load; I'm not sure if swapping to three separate cables would fix that as it seems to imply a PSU issue (850W though). Is anyone running a 3x8 pin card with two type 4 cables and still holding >12v on all 3 connectors at full TDP?


----------



## Falkentyne

Miro75 said:


> All resistors soldered and I'm sure about reliable joints. I scrapped conformal coating. Have you shunted all 5 resistors on the top?
> 
> Look on that: stock values, before shunt. PCI-e power draw is very low...
> View attachment 2465426


I have a Founder's Edition 3090. I painted over the 6 main shunts (there are three on front, three on back).
There are three "micro 005 shunts" that are impossible to stack a shunt on (at least not 2512 size shunts), and one of those connects to PCIE slot, one of those connects to something (chip?) and the third connects to ......the RGB cable! Needless to say those are left alone!

Take a look at these two posts. Granted both are 3090's but I have no idea what's up with the mvdcc reporting.

This one is olrdtg's modded 3090 FE with 3 mOhm resistors replacing the stock 5 mOhms. Multiply the values by 1.67 to get the real watts.

You can see here that MVDCC is much lower than the chip power draw, although this was taken at load. His MVDCC is 32W; multiply it by 1.67 to get the real MVDCC (53.44W).

RTX 3090 Founders Edition working shunt mod — www.overclock.net

The next link is a reference PNY 3090, unmodded. Motivman's card.

He selected the max PCIE power draw and 8 pins in GPU-Z.
I do see that somehow he's drawing 70 watts through MVDCC by default at idle.

That doesn't make sense at all. His GPU chip power draw is LOWER than his MVDCC.
He's drawing more power through MVDCC at idle than olrdtg is at load!
If you multiply olrdtg's values by 1.67 in the above link (606W board power, 53.44W MVDCC), that's still less than what is in the bottom link.

RTX 3090 Founders Edition working shunt mod — www.overclock.net

I frankly have no idea what's up here. It seems like Founder's Edition cards report much lower MVDCC than AIB cards?

I took a screenshot of my 'Uncorrected' values at idle with voltage locked at 1.10v, to keep the card in full speed clocks. (You can multiply these values by 1.33x - 1.36x to get real values).


----------



## Falkentyne

Stash said:


> How much vdroop is normal? My #2 connector always hits 11.9v on full load; I'm not sure if swapping to three separate cables would fix that as it seems to imply a PSU issue (850W though). Is anyone running a 3x8 pin card with two type 4 cables and still holding >12v on all 3 connectors at full TDP?


12v droop is a function of the power supply regulating the 12v, the resistance drop across any connectors, and any resistance drop once it reaches the GPU
Post a picture of your hwinfo64 12v pin readings. It will show both the PCIE 12 pin voltages that the GPU is getting from the cable and what the GPU is actually using.


----------



## Stash

Falkentyne said:


> 12v droop is a function of the power supply regulating the 12v, the resistance drop across any connectors, and any resistance drop once it reaches the GPU
> Post a picture of your hwinfo64 12v pin readings. It will show both the PCIE 12 pin voltages that the GPU is getting from the cable and what the GPU is actually using.


At 102% TDP.
Note: 8-pin #1 is a sole type 4 (6+2) from the PSU whereas 8-pin #2 & #3 are two 6+2 connectors from the same cable.


----------



## Falkentyne

Stash said:


> At 102% TDP.
> Note: 8-pin #1 is a sole type 4 (6+2) from the PSU whereas 8-pin #2 & #3 are two 6+2 connectors from the same cable.


This is PSU related.
What is the 12v reading coming from your motherboard sensors (it's farther up)? That is part of the 24-pin main cable.
Mine is 12.264v idle, 12.208v load. It's a Seasonic Prime PX-1000. I am also using the Seasonic 12-pin cable instead of the FE adapter (this does not affect the motherboard 12v reading, but it did give me another +15MHz on the core!!)

Here are my voltage rails with my 3090 FE sweating at 530W and the heatsink not able to cope with that kind of power draw very well.
(I had the MSI Afterburner voltage slider set to 100%, which allows the boost algorithm to use the 1.10v voltage tier on the curve "when it wants to".)


----------



## Stash

Falkentyne said:


> This is PSU related.
> What is the 12v reading coming from your motherboard sensors (it's farther up). That is part of the 24 pin main cable.
> Mine is 12.264 idle, 12.208v load. It's a Seasonic Prime PX-1000 . I am also using the Seasonic 12 pin cable instead of the FE adapter (this does not affect the motherboard 12v reading however, but it did give me another +15 mhz on the core!!)


That was my first suspicion; the 12v rail, according to the motherboard sensors, seems stable at around 12.168v idle/average, dropping to 12.024v under full load. It's a relatively new RM850x and hasn't caused me grief in the past. I'm willing to consider the cable being an issue but I think given the voltage disparity it is likely the PSU being at capacity or faulty.


----------



## Falkentyne

Stash said:


> That was my first suspicion; the 12v rail, according to the motherboard sensors, seems stable at around 12.168v idle/average, dropping to 12.024v under full load. It's a relatively new RM850x and hasn't caused me grief in the past. I'm willing to consider the cable being an issue but I think given the voltage disparity it is likely the PSU being at capacity or faulty.


There's nothing wrong with your rails. 11.9v isn't low. 
If you added 200mv, yours would match up with mine.
Seasonic is known for strong 12v.


----------



## DOOOLY

Well I received EK 3080 Strix water block but still waiting for a card 😥Here some pictures


----------



## olrdtg

Miro75 said:


> Hi... I shunted my 3080 Ventus but it's still hitting the power limit. 5 mOhm, 5 resistors on the top + the PCIe one on the bottom. And... nothing, look at the pic. Any idea what is wrong?


As Falkentyne said, your MVDDC is way too high for having a correct shunt mod installation. First of all, what are you using to benchmark? If you are using Furmark -- don't. It acts like a darn power virus and can give some whacked out numbers in GPU-Z as it literally pushes the card to the red-line even when shunted.
Second, would you mind taking some clear pictures of both sides of your PCB + a shot of all the shunt modded resistors? I need to get a clear look at the PCB and resistors to be of any help.

When running a benchmark like Heaven or Superposition, GPU-Z should look sorta like the screenshot above or this one below:

Edit: took the last screenshot of GPU-Z too late after closing the benchmark. Started Heaven back up and got a better screenshot.


----------



## Stash

Falkentyne said:


> There's nothing wrong with your rails. 11.9v isn't low.
> If you added 200mv, yours would match up with mine.
> Seasonic is known for strong 12v.


Alright, in that case the vdroop on #2 is likely related to the cable configuration then? I'm gonna swap over to 3 separate type 4s tomorrow and I'll see if it helps normalise the 3x8s.


----------



## Falkentyne

olrdtg said:


> As Falkentyne said, your MVDDC is way too high for having a correct shunt mod installation. First of all, what are you using to benchmark? If you are using Furmark -- don't. It acts like a darn power virus and can give some whacked out numbers in GPU-Z as it literally pushes the card to the red-line even when shunted.
> Second, would you mind taking some clear pictures of both sides of your PCB + a shot of all the shunt modded resistors? I need to get a clear look at the PCB and resistors to be of any help.
> 
> When running a benchmark like Heaven or Superposition, GPU-Z should look sorta like the screenshot above or this one below:
> 
> View attachment 2465444
> 
> 
> Edit: took the last screenshot of GPU-Z too late after closing benchmark. Started heaven back up and got a better screenshot.


In his screenshot of the pre-modded board, he is drawing 100W of MVDCC! How is that even possible?

[Official] NVIDIA RTX 3080 Owner's Club — www.overclock.net

The fact that his mvdcc did reduce seems to imply that his shunt mod worked. But why is his PCIE so low and his MVDCC so high to begin with?

I think we need some close up pictures of both sides of his PCB. Something else is limiting power because all of his rails did seem to go down by about 50%.


----------



## motivman

Falkentyne said:


> I have a Founder's Edition 3090. I painted over the 6 main shunts (there are three on front, three on back).
> There are three "micro 005 shunts" that are impossible to stack a shunt on (at least not 2512 size shunts), and one of those connects to PCIE slot, one of those connects to something (chip?) and the third connects to ......the RGB cable! Needless to say those are left alone!
> 
> Take a look at these two posts. Granted both are 3090's but I have no idea what's up with the mvdcc reporting.
> 
> This one is olrdtg's modded 3090 FE with 3 mOhm resistors replacing the stock 5 mOhms. Multiply the values by 1.67 to get the real watts.
> 
> You can see here that mvdcc is much lower than chip power draw.
> although this was taken at load. His MVDCC is 32W, multiply it by 1.67 to get the real MVDCC (53.44W)
> 
> RTX 3090 Founders Edition working shunt mod — www.overclock.net
> 
> The next link is a reference PNY 3090, unmodded. Motivman's card.
> 
> He selected the max PCIE power draw and 8 pins in GPU-Z.
> I do see that somehow he's drawing 70 watts through MVDCC by default at idle.
> 
> That doesn't make sense at all. His GPU chip power draw is LOWER than his MVDCC.
> He's drawing more power through MVDCC at idle than olrdtg is at load!
> If you multiply olrdtg's values by 1.67 in the above link, (606W board power, 53.44W MVDCC), that's still less than what is in the bottom link.
> 
> RTX 3090 Founders Edition working shunt mod — www.overclock.net
> 
> I have frankly no idea whats up here. Seems like Founder's Edition cards are reporting much lower mvdcc than AIB cards?
> 
> I took a screenshot of my 'Uncorrected' values at idle with voltage locked at 1.10v, to keep the card in full speed clocks. (You can multiply these values by 1.33x - 1.36x to get real values).
> 
> View attachment 2465430


I have a reason why my card seems to be drawing that much MVDCC at idle: for some reason my card decides to run 1755MHz at idle. Not sure why at the moment; trying to figure out what program on my PC is causing it...


----------



## Falkentyne

motivman said:


> I have a reason why my card seems to be drawing that much MVDCC at idle.. for some reason, my card decides to run 1755mhz at idle, not sure why at the moment, tryna figure out what program in my PC is causing it....
> View attachment 2465446


NVCP set to "prefer max performance" will cause that. But that won't cause high memory wattage.
You can look in task manager to see if anything is using the GPU.
What is the GPU Load %? It's probably higher up in GPU-Z.


----------



## Miro75

All screenshots were done in Furmark. Will try another benchmark.

Edit: Heaven, 3DMark: same story. Max board power draw is 200W. The shunt is 0.005 Ohm, which means 400W in reality.

Will post pics of the board. I think the Ventus is strongly limited somehow.


----------



## motivman

Falkentyne said:


> NVCP set to "prefer max performance" will cause that. But that won't cause high memory wattage.
> You can look in task manager to see if anything is using the GPU.
> What is the GPU Load %? It's probably higher up in GPU-Z.


Oh yeah, I remembered I set max performance because I was having issues with Marvel's Avengers and the game stuttering. I restored the NVIDIA Control Panel to default... and this is what I get. Looks better now.


----------



## Falkentyne

Miro75 said:


> All screenshots were done in Furmark. Will try another benchmark.


Whoa...you tested furmark??
Furmark automatically power throttles cards to base clocks in the driver! This has been done for years ever since people were blowing up their VRM's with Furmark in the X800 XT / 8800 GTX days....


----------



## Falkentyne

motivman said:


> Oh yeah, I remembered I set max performance because I was having issues with Marvel's Avengers and the game stuttering. I restored the NVIDIA Control Panel to default... and this is what I get. Looks better now.
> 
> View attachment 2465452


That looks much better. Also if you have a problem with games like that with oscillating core clocks at low load causing stuttering, just lock a voltage point on the V/F curve in Afterburner and save it as a profile. Problem solved.


----------



## Miro75

Falkentyne said:


> Whoa...you tested furmark??
> Furmark automatically power throttles cards to base clocks in the driver! This has been done for years ever since people were blowing up their VRM's with Furmark in the X800 XT / 8800 GTX days....


But... VR, 3DMark, Heaven, Superposition: all of them hit the power limit. I spent more than a day trying to sort it out by myself... no idea what throttles this f....ng Ventus. Any idea what could be wrong...


----------



## motivman

Falkentyne said:


> That looks much better. Also if you have a problem with games like that with oscillating core clocks at low load causing stuttering, just lock a voltage point on the V/F curve in Afterburner and save it as a profile. Problem solved.


thanks for the tip man, you are a great resource!


----------



## Miro75

Pics of my Ventus


----------



## Falkentyne

Miro75 said:


> But... VR, 3DMark, Heaven, Superposition: all of them hit the power limit. I spent more than a day trying to sort it out by myself... no idea what throttles this f....ng Ventus. Any idea what could be wrong...


Can you please take a high resolution picture of both sides of the board? Be careful with the thermal pads when you disassemble the board. Perhaps there is something else that must be shunted or there is some weird protection going on. @olrdtg will have to help you with this. I have absolutely no experience in such matters. Are there any fuses next to the shunts? A very clear picture of the PCB would tell us.

_edit_ looks like you replied while I was typing 

You stacked 5 mOhm shunts, which reduces reported power draw by 50%, and the board is power throttling at "half" power (meaning it's still using the same amount of power as before?)


----------



## VladimirAG

I need more watercooling

https://www.3dmark.com/fs/24020282


----------



## Falkentyne

@Miro75 What are all those weird marks around the shunts? It almost looks like the shunts have exposed metal connecting them to something.
It looks almost like one shunt has a strange 'solder string' attaching it to a tiny SMD right next to it.

And why are there strange marks around the 8-pin shunts? Is that flux?
@olrdtg Is it me or does that PCB look like it was soldered at 4:55 PM on a Friday afternoon?


----------



## Miro75

@Falkentyne it's flux... anyway, I desoldered the shunts and the GPU works fine, no issues. The power limit is 320W, as designed. At the same time, without shunts the Ventus has a 320W power limit no matter which BIOS is crossflashed. I tried many BIOSes (literally all that are available), and the power limit stays stuck at the stock 320W regardless.

I tried two different shunts, 5 and 10 mOhm. In both cases the power limit lands around 384W. With 10 mOhm the power limit kicks in at 80% of TDP (1.5×320×0.8 = 384W); with 5 mOhm at 60% (2.0×320×0.6 = 384W). It seems there is a magic sensor that doesn't permit going above 384W. Interestingly, the one and only BIOS that let me go above 384W was the Aorus Master, by about +15...20%, but that BIOS disabled all of my DisplayPorts except one.
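Working backward from those two throttle points gives the same real cap both times. A quick sketch of that arithmetic (my own calculation, assuming the new shunts were stacked on top of the stock 5 mOhm ones):

```python
# Stacking lowers the sensed resistance, so the card under-reads power by a
# fixed factor; real draw = BIOS limit x reported TDP fraction x that factor.
def stacked_multiplier(r_stock, r_added):
    r_eff = r_stock * r_added / (r_stock + r_added)  # parallel combination
    return r_stock / r_eff

def real_power(bios_limit_w, reported_tdp_fraction, mult):
    return bios_limit_w * reported_tdp_fraction * mult

m10 = stacked_multiplier(0.005, 0.010)  # 10 mOhm on 5 mOhm -> 1.5x
m5 = stacked_multiplier(0.005, 0.005)   # 5 mOhm on 5 mOhm  -> 2.0x

print(round(real_power(320, 0.8, m10), 1))  # 384.0
print(round(real_power(320, 0.6, m5), 1))   # 384.0 -> same real cap both times
```

Both configurations converging on the same number is what suggests a hard ~384W limit somewhere on the board rather than a measurement fluke.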


----------



## Falkentyne

Miro75 said:


> @Falkentyne it's flux... anyway, I desoldered the shunts and the GPU works fine, no issues. The power limit is 320W, as designed. At the same time, without shunts the Ventus has a 320W power limit no matter which BIOS is crossflashed. I tried many BIOSes (literally all that are available), and the power limit stays stuck at the stock 320W regardless.
> 
> I tried two different shunts, 5 and 10 mOhm. In both cases the power limit lands around 384W. With 10 mOhm the power limit kicks in at 80% of TDP (1.5×320×0.8 = 384W); with 5 mOhm at 60% (2.0×320×0.6 = 384W). It seems there is a magic sensor that doesn't permit going above 384W. Interestingly, the one and only BIOS that let me go above 384W was the Aorus Master, by about +15...20%, but that BIOS disabled all of my DisplayPorts except one.


Wow, I hope @olrdtg can help you with this. Maybe even @chispy
Something is definitely limiting power. Usually in past cases it's PCI Express not being shunted. But yours is...

The one completely bizarre thing about your board is the MVDCC power draw. You posted stock results before the mod, but MVDCC was reporting 100W+! I don't even think it's possible for GDDR6X to draw 100W: if each chip draws 2 watts, that would be 20W on a 3080 and 48W on a 3090. Yet you're pulling 100W...

However all your power rails got reduced reporting from your mod, including MVDCC. But a 384W power cap...that has to be coming from someplace...this foe is beyond me.


----------



## LukeOverHere

dr.Rafi said:


> Yes mate, Sydney. I asked because Colorful cards are very rare on our side and plentiful on yours. How is the weather, too hot?
> With no aircon I start dropping my overclock settings every couple of days as it gets warmer.


Hey Mate,
I agree, it seems to be a very reasonable card to grab in WA, and they're in stock long before the other brands.
It's currently raining here in Perth, and it was raining yesterday as well, so definitely not hot at the moment haha.
Out of interest, what 3080 model are you running?


----------



## ssgwright

I'm also getting about a 140w power draw by the MVDDC on my TUF


----------



## olrdtg

Miro75 said:


> @Falkentyne it's flux... anyway, I desoldered the shunts and the GPU works fine, no issues. The power limit is 320W, as designed. At the same time, without shunts the Ventus has a 320W power limit no matter which BIOS is crossflashed. I tried many BIOSes (literally all that are available), and the power limit stays stuck at the stock 320W regardless.
> 
> I tried two different shunts, 5 and 10 mOhm. In both cases the power limit lands around 384W. With 10 mOhm the power limit kicks in at 80% of TDP (1.5×320×0.8 = 384W); with 5 mOhm at 60% (2.0×320×0.6 = 384W). It seems there is a magic sensor that doesn't permit going above 384W. Interestingly, the one and only BIOS that let me go above 384W was the Aorus Master, by about +15...20%, but that BIOS disabled all of my DisplayPorts except one.





Falkentyne said:


> Wow, I hope @olrdtg can help you with this. Maybe even @chispy
> Something is definitely limiting power. Usually in past cases it's PCI Express not being shunted. But yours is...
> 
> The one completely bizarre thing about your board is the MVDCC power draw. You posted stock results before mod, but MVDCC was reporting 100W+! I don't even think it's possible for GDDR6X to draw 100W. If each chip is drawing 2 watts, that would be 20W on a 3080 and 42W on a 3090. Yet you're pulling 100W...
> 
> However all your power rails got reduced reporting from your mod, including MVDCC. But a 384W power cap...that has to be coming from someplace...this foe is beyond me.


Miro75, I looked at the pics and your shunt mods look fine from the surface. There are 2 small shunt resistors on the 3080 and 3090 FE boards (also on reference PCBs) that are directly connected to the larger shunt resistors (the ones you modded). I modded the two small ones on my 3090 FE the other day to see if it'd make any difference, but it really didn't. I'm wondering whether the presence of these tiny shunt resistors on some of these reference cards has any bearing on the mod. Unfortunately I don't have a Ventus 3080 to experiment with, so I couldn't really tell you which shunt goes where. My best guess here is that one of the resistors was improperly installed or is bad. It's possible to have a bad shunt resistor; if you have any fresh ones, try using those. Also, I'm not sure if the Ventus has the fuse connected to the PCIe shunt resistor; if it does, you may want to use a 20 mOhm resistor on the PCIe one, but again, I don't know whether it has the fuse or not.

Another possibility is that the readings in GPU-Z are just completely out of whack because the card can't get a real grasp on what the power levels are. If I can ever get my hands on a Ventus 3080 to shunt mod, I'll do some experimentation.

Falkentyne -- MVDDC could be that high. IIRC mine was around 80 ~ 100W stock. Search for GPU-Z screenshots of the 3080; Videocardz put one up that shows 75W under a medium load 🤷‍♂️ However, when shunt modded, his MVDDC should read MUCH lower, somewhere around 30 to 40W depending.

Edit: I went back and looked at an old GPU-Z log from before I modded my card. During my benchmarks the most mine drew over MVDDC was 83W, and that was during a peak. Its average was about 76W.

Hey @bmgjet -- if you see this, just out of curiosity, does the Ventus have the PCIe fuse?


----------



## Adrian76

Falkentyne said:


> This is PSU related.
> What is the 12v reading coming from your motherboard sensors (it's farther up). That is part of the 24 pin main cable.
> Mine is 12.264 idle, 12.208v load. It's a Seasonic Prime PX-1000 . I am also using the Seasonic 12 pin cable instead of the FE adapter (this does not affect the motherboard 12v reading however, but it did give me another +15 mhz on the core!!)
> 
> Here are my voltage rails with my 3090 FE sweating at 530W and the heatsink not able to cope with that kind of power draw very well.
> (I had the MSI afterburner voltage slider set to 100%, which allows the boost algorithm to use the 1.10v voltage tier on the curve "when it wants to".
> 
> View attachment 2465432


I'm sorry but you're wrong. The ATX spec says +/-5% for 12V. My Seasonic drops to 11.79v on the 12v rails, and it can drop to 11.4v. However, I'd be looking more at your system, because it's nearly overvolting: 12.6v is the max spec and I'm seeing yours spike to 12.5v.
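For reference, the ±5% window being cited works out like this (a trivial sketch, assuming the nominal 12.0v ATX rail):

```python
# ATX 12 V rail with the commonly cited +/-5% regulation tolerance
nominal = 12.0
low, high = round(nominal * 0.95, 2), round(nominal * 1.05, 2)
print(low, high)  # 11.4 12.6
```

So 11.9v under load sits comfortably inside the window; the debate below is about whether "in spec" is the same thing as "good for overclocking."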


----------



## Falkentyne

Adrian76 said:


> I'm sorry but you're wrong. The ATX spec says +/-5% for 12V. My Seasonic drops to 11.79v on the 12v rails, and it can drop to 11.4v. However, I'd be looking more at your system, because it's nearly overvolting: 12.6v is the max spec and I'm seeing yours spike to 12.5v.


I'm sorry but I'm not wrong.
My R9 290X started _black screening_ when the 12v dropped down to 11.45v.
When I replaced the PCIe cable so it held a solid 11.75v, it stopped black screening.

A healthy 12v rail is highly prized by overclockers. Old PC Power and Cooling PSUs even had an adjustable 12v rail potentiometer for just this purpose.

I don't know what's up with all these beginners coming on this forum acting like they know what they're talking about when some people here have been overclocking since the Pentium MMX days...

Please do your research, Adrian.


----------



## VPII

I'm pretty happy; I finally broke 19K in Time Spy. The CPU score is not as high as it should be. I found that to get the highest CPU score I first need to run the Time Spy CPU test only, which scores around 17K, and then run the full test. Ambient temps on this side are not the best, but I'm trying to cool the GPU as well as possible with added fans blowing onto the card, which helps.



https://www.3dmark.com/spy/15292190


----------



## Adrian76

Falkentyne said:


> I'm sorry but I'm not wrong.
> My R9 290X started _black screening_ when the 12v dropped down to 11.45v.
> When I replaced the PCIe cable so it held a solid 11.75v, it stopped black screening.
> 
> A healthy 12v rail is highly prized by overclockers. Old PC Power and Cooling PSUs even had an adjustable 12v rail potentiometer for just this purpose.
> 
> I don't know what's up with all these beginners coming on this forum acting like they know what they're talking about when some people here have been overclocking since the Pentium MMX days...
> 
> Please do your research, Adrian.


With all respect, previous AMD cards were not the most stable graphics cards anyway; I know, I had 7970s.

I have been overclocking since the Cyrix 166 days so I am not a beginner, but if old PSUs had manual regulation it means they had poor auto regulation.

My research is that a good PSU manufacturer will tell you +/-5% tolerance is acceptable, while cheap ones would probably say +/-10% is acceptable, which is not true by the way.

I was merely stating that you told this poster it was his PSU when his readings were showing 11.8+v, which is well within tolerance; it's barely 1.5%.


----------



## Koby990

Koby990 said:


> Can a RTX 3080 Founders Edition be flash modded? I ask due to the different power connector.


So, umm, which BIOS do you guys recommend?
I should receive my block soon.


----------



## dr.Rafi

Miro75 said:


> @Falkentyne it's flux... anyway, I desoldered the shunts and the GPU works fine, no issues. The power limit is 320W, as designed. At the same time, without shunts the Ventus has a 320W power limit no matter which BIOS is crossflashed. I tried many BIOSes (literally all that are available), and the power limit stays stuck at the stock 320W regardless.
> 
> I tried two different shunts, 5 and 10 mOhm. In both cases the power limit lands around 384W. With 10 mOhm the power limit kicks in at 80% of TDP (1.5×320×0.8 = 384W); with 5 mOhm at 60% (2.0×320×0.6 = 384W). It seems there is a magic sensor that doesn't permit going above 384W. Interestingly, the one and only BIOS that let me go above 384W was the Aorus Master, by about +15...20%, but that BIOS disabled all of my DisplayPorts except one.


As I explained, my friend, your card is drawing high power, and your GPU-Z sensor screenshot shows you are not putting the GPU under a heavy load; it's boosting at 1800+ MHz, so it's normal to see the power limit flag on screen. Try measuring power at the wall socket before and after the mod so you can determine the actual power consumption.
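That wall-socket method can be sketched like this; the 0.90 efficiency figure is an assumed Gold-class number, not from the thread, and `card_draw` is a hypothetical helper:

```python
# Estimate actual card draw from wall measurements before/after a GPU load,
# correcting for PSU efficiency (the card sees DC power, the meter sees AC).

def card_draw(wall_idle_w: float, wall_load_w: float,
              psu_efficiency: float = 0.90) -> float:
    """Approximate DC power added by the GPU load, in watts."""
    return (wall_load_w - wall_idle_w) * psu_efficiency

# e.g. 120 W at the wall idle, 520 W under a GPU-only load:
print(f"card is pulling roughly {card_draw(120, 520):.0f} W")
```

This slightly overestimates if CPU load also rises during the test, so keep the rest of the system's load as constant as possible between the two readings.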


----------



## dr.Rafi

Miro75 said:


> @Falkentyne it's flux... however I desoldered shunts and GPU works well, no issues. Power limit 320W like by the design. Sametime, without shunts Ventus has power limit 320W doesnt matter which BIOS is crossflashed. I tried many BIOSes (literally speaking all available). And the power limit is sticked to the stock one... 320W despite of all.
> 
> I tried two different shunts 5 and 10mOhms. I both cases the power limit is around 384W. In case of 10mOhms power limit kicks-off at 80% of TDP (1.667×320×0.8=384W)). In case of 5 mOhms at 60% (2.0x320x0.6=384W). It seems that there is a magic sensor which doesnt permit to go above 384W. Interestingly - the one and only BIOS which helps me to go above 384W was Aorus Master. About +15...20% more... but this BIOS disabled all of my display ports except the one.


On which BIOS are you getting that 384 W after the shunt mod?
I replied, then found the answer: the TDP reading is also wrong. GPU-Z just reads a percentage of the flashed BIOS's maximum TDP. It's the same for me: I have a 0.010 Ohm shunt on now, and I hit the power limit at 80%. Don't worry about the TDP reading; your 0.010 Ohm shunt maxes out at 80% TDP = 80% of your installed BIOS wattage = 270 W in GPU-Z (these are all GPU-Z readings). The real usage is the GPU-Z wattage × 1.66 ≈ 445 W, so your post-shunt power consumption is hitting the power limit, and you are really consuming 80% TDP × 1.66 = *132%* TDP. I hope that's clear; as I explained, I measured at the wall socket and the numbers check out. Sorry, I know my English sucks.
And to confirm my theory: short all the shunts and you will never see the power limit kick in, but you will draw insane power under some loads. I tried it and it won't do any damage, but my power supply's overcurrent protection was tripping and resetting.
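The arithmetic behind this can be written out. A minimal sketch, assuming a 5 mOhm stock shunt (the stock value varies by board, so treat it as an assumption; the `parallel` helper and the example numbers other than the 270 W reading are mine):

```python
# Shunt-mod arithmetic: the card senses current as the voltage drop across a
# shunt resistor, so lowering the effective resistance makes it under-read
# power by the ratio of stock to effective resistance.

def parallel(r1: float, r2: float) -> float:
    """Effective resistance of a resistor soldered on top of the stock shunt."""
    return r1 * r2 / (r1 + r2)

R_STOCK = 0.005                      # 5 mOhm stock shunt (assumed)
R_ADDED = 0.010                      # 10 mOhm stacked on top
r_eff = parallel(R_STOCK, R_ADDED)   # ~3.33 mOhm

scale = R_STOCK / r_eff              # actual power = GPU-Z reading * scale
reported = 270.0                     # GPU-Z reading quoted above (W)
print(f"scale: {scale:.2f}x, actual draw: ~{reported * scale:.0f} W")
```

Stacking 5 mOhm on a 5 mOhm stock shunt gives exactly 2.0×, matching the posts above; plain parallel math for 10 mOhm on 5 mOhm gives 1.5× rather than the quoted 1.66×, so the stock shunt on that board may not be exactly 5 mOhm.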


----------



## LukeOverHere

aayman_farzand said:


> Got myself a Colorful iGame 3080 Advanced. Not one of the usual brands but it was the only thing I could secure. Card works great and looks fantastic. Minimalistic + some tasteful lighting to remind me its for gaming.
> 
> I think this is the cheapest 3x8pin card, but I'm sure its power phases are not similar to Strix/FTW3. How are you guys testing your boosts? 3DMark only or something more?


I ended up buying an iGame 3080 Advance OC as well; very solid card, and the build quality is fantastic. I mentioned this earlier in my posts, but today I pushed it a little further than before: +135 on the core clock, +700 on the memory. Afterburner is still reading 2100 MHz max for the core, but 10202 MHz for the memory, which I still think is excellent. The BIOS is also 400 W for these cards, which is a bonus. Out of interest, have you tried flashing a different BIOS with a 450 W rating? The ASUS or EVGA BIOS? I can't find any solid info on whether I could safely flash one of the higher-wattage BIOSes.


----------



## Miro75



olrdtg said:


> Miro75, I looked at the pics and your shunt mods look fine from the surface. There are 2 small shunt resistors on the 3080 and 3090 FE boards (also on reference PCBs) that are directly connected to the larger shunt resistors (the ones you modded). I modded the two small ones on my 3090 FE the other day to see if it'd make any difference, but it really didn't. I'm wondering whether these tiny shunt resistors on some of these reference cards have any bearing on the mod. Unfortunately I don't have a Ventus 3080 to experiment with, so I couldn't really tell you which shunt goes where. My best guess here is that one of the resistors was improperly installed or is bad. It's possible to have a bad shunt resistor; if you have any fresh ones, try using those. Also, I'm not sure if the Ventus has the fuse connected to the PCIe shunt resistor; if it does, you may want to use a 20 mOhm resistor on the PCIe one, but again, I don't know if it has the fuse or not.
> 
> Another possibility is that the readings in GPU-Z are just completely out of whack because the card can't get a real grasp on what the power levels are. If I can ever get my hands on a Ventus 3080 to shunt mod, I'll do some experimentation.
> 
> Falkentyne -- MVDDC could be that high. IIRC mine was around 80 ~ 100W stock. Search for GPU-Z screenshots of the 3080, Videocardz put one up that shows 75W under a medium load 🤷‍♂️ However, when shunt modded, his MVDDC should be MUCH lower. Somewhere around like 30 to 40W depending.
> 
> Edit: I went back and looked at an old GPU-Z log from before I modded my card. During my benchmarks the most mine drew over MVDDC was 83 W, and that was during a peak. It's average was like 76 W
> 
> Hey @bmgjet -- if you see this, just out of curiosity, does the Ventus have the PCIe fuse?



I've done an experiment. Shunted the PCIe rail down to 2.5 mOhm (5 mOhm stacked on the stock 5 mOhm), all remaining shunts at 5 mOhm. What happened?
The PCIe power-draw reading dropped from 23 W (5 mOhm shunt) to 16 W (2.5 mOhm shunt).
The 2x8-pin power-draw reading increased from 83 W to 90 W.
The total power draw (measured with a kill-a-watt meter) increased by ca. 30 W.

The Ventus definitely has a weird PCIe/8-pin power-draw ratio. At stock, almost 280 W comes from the 2x8-pin connectors and only ~30 W from the slot. Look at the stock GPU-Z readings before the shunt mod: 311 W total board power, 31 W from the slot, 280 W from the 8-pin sockets.









The Ventus is a strange card... Any ideas? I'm open to trying any good solution, if proposed. Thanks in advance.
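Per-rail numbers like the ones above can be sanity-checked against their nominal limits. A minimal sketch (the 75 W slot and 150 W per-8-pin figures are the usual PCIe CEM numbers, not from this thread; `check_rails` is my name):

```python
# Sum per-rail power readings and flag rails above their nominal limits.
# Readings are the stock Ventus numbers quoted above; limits follow the
# common PCIe figures: 75 W from the slot, 150 W per 8-pin connector.

RAIL_LIMITS = {"slot": 75.0, "8pin_1": 150.0, "8pin_2": 150.0}

def check_rails(readings: dict) -> list:
    """Return the rails drawing above their nominal limit."""
    return [rail for rail, watts in readings.items()
            if watts > RAIL_LIMITS[rail]]

stock = {"slot": 31.0, "8pin_1": 140.0, "8pin_2": 140.0}  # ~280 W over the 8-pins
print(f"total: {sum(stock.values()):.0f} W, over-limit rails: {check_rails(stock)}")
```

At stock, all three rails sit well inside spec, which is consistent with the card leaning so heavily on the 8-pins rather than the slot.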


----------



## techenth

Couldn't find a Strix or Gaming Trio, so I got a TUF 3080 OC instead. Stable at +270/700 in benchmarks, +200/700 in games. 90% fan speed, temps around 60°C. The card is dying for more power; hope we can get a custom BIOS soon. The below is from Time Spy, without the shunt mod.











Should I return the card and chase a Strix OC? Or should I be content with what I've got?


----------



## LukeOverHere

Some very strange results I have had recently with my Colorful iGame 3080 Advance OC Edition.

As a reference point, the stock Colorful BIOS is 400 W. I have actually downclocked a little from my previous post (again); this is what I'm running now, which I am happy with: +135 on the core, +600 on memory, peak boost 2115 MHz, stabilising at 2055 MHz after about 30 minutes and not moving. Temps at worst are now 74 degrees. I ran multiple benchmarks to try to make it crash; no crashes, so I'm happy at the moment.

The strange thing is, I used the secondary BIOS switch and flashed the ASUS Strix BIOS (450 W). The flash went very smoothly, and if I don't touch anything the system is very stable; GPU-Z reports 450 W being used under load, which is great. But the odd thing is, when I overclock with +600 memory and any sort of core clock adjustment at all (+60, +75, +90, etc.), the performance is actually worse and the card runs slightly hotter (78 degrees-ish). I can't wrap my head around that; the core clock reduces itself to around 1995-2020 MHz while pulling 450 W.

Can anybody who is a bit more knowledgeable in OC'ing these cards explain this to me?

I also forgot to mention: Auto OC Scanner does not work at all for me on either BIOS. I'm assuming I need a beta version of Afterburner?


----------



## Nizzen

techenth said:


> Couldn't find a Strix or Gaming Trio so got a TUF 3080 OC instead. Stable at +270/700 benchs, +200/700 in games. 90% fan speed, temps around 60c. The card is dying for more power. Hope we can get a custom bios soon. The below is from Timespy without the shunt mod.
> 
> View attachment 2465580
> 
> 
> 
> Should I return the card and chase a Strix OC? Or should I be contempt with what I got


+270 and +200 say nothing about the actual frequency. 

Run Port Royal, and post the average GPU clock from that run. 
That is your real GPU clock under heavy load.


----------



## dr.Rafi

LukeOverHere said:


> Some very strange results i have had recently with my Colorful iGame 3080 Advance OC Edition.
> 
> As a reference point, the stock Colorful BIOS is 400W, i have actually down clocked a little from my previous post (again) this is what I’m running now which i am happy with: +135 on the core +600 on memory, Peak boost 2115Mhz, Stabilises at 2055 after about 30 minutes and doesn’t move, temps at worst case are now 74 degrees - ran multiple bench marks to try and make it crash, no crashes, so I’m happy at the moment.
> 
> The strange thing is, i used the secondary BIOS switch and flashed the ASUS Strix BIOS (450W), flash went very smooth, if i don’t touch anything the system is very stable, GPU-Z reports 450W is being used under load which is great. But the odd thing is, when i overclock with +600 memory, and any sort of core clock adjustment at all +60, +75, +90 etc.... the performance is actually worse, and runs slightly hotter (78 degree’s ish) i cant wrap my head around that, Core clock reduces itself to around 1995mhz - 2020mhz while pulling 450w.
> 
> Can anybody explain this to me who is a bit more knowledgeable in OC’ing these cards?
> 
> I also forgot to mention, Auto OC Scanner does not work at all for me on either BIOS, I’m assuming i need a BETA version of Afterburner?


First, the card is dropping its boost clock because of higher temperatures. The ASUS Strix fans are bigger and have a slower max RPM than yours, so the Strix BIOS is capping your (smaller) fans at a lower max speed; see if you can find a way to push them back up (I think yours are 3000 RPM max and the ASUS fans 2800 RPM, not sure though). For every ~6 degrees cooler, the card will boost about 15 MHz higher. Second, the ASUS Strix BIOS needs higher + offsets to reach the same performance; measure what the card actually boosts to under load, and try +130 or +140. That was my maximum with the Strix BIOS in heavy games before it crashed.
Hope this is helpful.
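That 6 °C / 15 MHz rule of thumb can be turned into a quick estimate. Note the constants are dr.Rafi's observation from these cards, not an NVIDIA spec, and `boost_delta` is a hypothetical helper:

```python
# Rough GPU Boost estimate: cooler cards hold higher boost bins.
# The "15 MHz per ~6 deg C" rule of thumb is from the post above; actual
# GPU Boost behaviour depends on the BIOS's voltage/frequency tables.

BIN_MHZ = 15.0
BIN_DEG_C = 6.0

def boost_delta(temp_before: float, temp_after: float) -> float:
    """Estimated boost clock change (MHz) for a temperature change (deg C)."""
    return (temp_before - temp_after) / BIN_DEG_C * BIN_MHZ

# e.g. dropping from 78 C back to 66 C would recover roughly two bins:
print(f"estimated gain: {boost_delta(78, 66):.0f} MHz")
```

That lines up with the ~2 bin (30-45 MHz) drop reported after flashing the Strix BIOS onto cards with faster but smaller fans.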


----------



## VPII

LukeOverHere said:


> Some very strange results i have had recently with my Colorful iGame 3080 Advance OC Edition.
> 
> As a reference point, the stock Colorful BIOS is 400W, i have actually down clocked a little from my previous post (again) this is what I’m running now which i am happy with: +135 on the core +600 on memory, Peak boost 2115Mhz, Stabilises at 2055 after about 30 minutes and doesn’t move, temps at worst case are now 74 degrees - ran multiple bench marks to try and make it crash, no crashes, so I’m happy at the moment.
> 
> The strange thing is, i used the secondary BIOS switch and flashed the ASUS Strix BIOS (450W), flash went very smooth, if i don’t touch anything the system is very stable, GPU-Z reports 450W is being used under load which is great. But the odd thing is, when i overclock with +600 memory, and any sort of core clock adjustment at all +60, +75, +90 etc.... the performance is actually worse, and runs slightly hotter (78 degree’s ish) i cant wrap my head around that, Core clock reduces itself to around 1995mhz - 2020mhz while pulling 450w.
> 
> Can anybody explain this to me who is a bit more knowledgeable in OC’ing these cards?
> 
> I also forgot to mention, Auto OC Scanner does not work at all for me on either BIOS, I’m assuming i need a BETA version of Afterburner?


Hi there

I found the same when I flashed my MSI Gaming X Trio with the Strix BIOS. What I found with the three RTX 3080s I had was:

Palit GamingPro OC would boost 300 to 315 MHz above base boost clock
Gigabyte Eagle OC would boost 225 MHz above base boost clock
MSI Gaming X Trio would boost 225 MHz above base boost clock

When I flashed the MSI Gaming X Trio with the Strix OC BIOS, I immediately saw a 3 to 4°C temperature increase at idle. When benching the card at more or less the same clocks, performance was not as good as with the MSI Gaming X Trio BIOS, and the same goes for the EVGA FTW3 Ultra BIOS. So at present I am back on the Gaming X Trio BIOS, as it works best.


----------



## dr.Rafi

VPII said:


> Hi there
> 
> I found the same when I flashed my MSI Gaming X Trio with the Strix bios. The problem I found with the three RTX 3080's I had was:
> 
> Palit Gamingpro OC would boost 300 to 315mhz from base boost clock
> Gigabyte Eagle OC would boost 225mhz from base boost clock
> MSI Gaming X Trio would boost 225mhz from base boost clock
> 
> When I flashed the MSI Gaming X Trio with the Strix OC bios I immediately saw a 3 to 4c temp increase at idle. When benching the card with more or less the same clocks, performance was not as good as with the MSI Gaming X Trio bios same with the Evga FTW3 Ultra bios. So at present I am back on the Gaming X Trio bios as it works best.


The fans on the Strix are bigger and slower and have a quieter curve, because it has a massive heatsink compared to your cards, and the Strix BIOS needs higher offsets in Afterburner to reach the same performance.


----------



## VPII

dr.Rafi said:


> Fans on strix are bigger and slower and have more silent curve because it have massive heat sink compare to your cards , and strix bios need more number in afterburner to reach the same perfomance.


I'm not arguing with you; I am just stating the difference in boost clocks between various AIB cards.


----------



## techenth

Nizzen said:


> +270 and +200 says nothing about the actually frequency.
> 
> Run Port Royal, and post the average gpuclock in that run.
> That is your real gpuclock on heavy load.





https://www.3dmark.com/3dm/53114090?




https://www.3dmark.com/3dm/53113941?




Should I keep it? What kind of performance improvement should I expect if I went for the Strix?

Update: https://www.3dmark.com/3dm/53144813?


----------



## VPII

techenth said:


> https://www.3dmark.com/3dm/53114090?
> 
> 
> 
> 
> https://www.3dmark.com/3dm/53113941?
> 
> 
> 
> 
> Should I keep it? What kind of performance improvement should I expect if I went for the Strix?


Your average temps seem a little high looking at the Time Spy benchmark. If you could get the card cooler, your average core clock would be above 2000 MHz and your GPU score around 19000. The CPU also holds you back, a little or a lot, which does not help. The card you have is great given the clock speeds you can bench. If you get a Strix OC it may be better in many ways, but you won't be assured of reaching the same max boost clock on the core. If you were to change your CPU, you would already see a drastic increase, even in the average clock speed. I've seen it on my system changing from my Ryzen 9 3950X to the Ryzen 9 5950X.

Look at the comparison below between my best run with the 3950X and my best run with the 5950X. It is only a slight improvement, but I managed a 19400 GPU score, which I could not get with the 3950X.



https://www.3dmark.com/compare/spy/15292190/spy/15161844#


----------



## techenth

VPII said:


> Your average temps seem a little high when looking the Time Spy benchmark. If you can get it cooler you average core clock speed would be above 2000 and your GPU score around 19000. The cpu does hold you back a little or a lot which also does not help that much. The card you have is great taken the clock speeds you can bench. If you get a Strix OC it may be better in many different ways, but you won't be assure of reaching the same max boost clock speed for the core. If you were to change your cpu you will already see a drastic increase even in the average clock speed. I;ve seen it on my system changing from my Ryzen 9 3950X to the Ryzen 9 5950X.
> 
> When you look at the comparison below between my best run with the 3950X and best run with the 5950X. It is only a slight improvement, but I managed 19400 GPU score which I could not get with my 3950X.
> 
> 
> 
> https://www.3dmark.com/compare/spy/15292190/spy/15161844#


Exactly the answer I was looking for, thanks. I have a new setup on its way; both temps and scores should get a tad better.


----------



## VPII

techenth said:


> Exactly the answer I was looking for, thanks. I have a new setup on its way, both temps and scores should get a tad better.


Sounds great, can't wait to see what you'll get. Mind you, my system is an open-bench setup on a Lian Li PC-T60, so I don't have a problem with airflow, which is why my temps are a lot lower; I also run fan speed at 100% while benching.


----------



## LukeOverHere

dr.Rafi said:


> First The card is dropping the boost clock because more tempratures Asus strix fans are bigger and have slower max rpm than yours so the bios is slowing your fans max speed which are smaller but if you find away to push them back( i think yours are 3000rpm max) and asus 2800 rpm ,not sure though, and for every 6 dgrees the card will boost 15 mhz, and Second thing is Asus strix bios need more +numbers to reach same performance , measured what the card is boosting to during load ,try 130 + or 140+ that was mine maxing with strix bios with heavy games before crash.
> Hope this be helpfull .


Thank you for this, that makes complete sense. If I no longer have the full RPM range on my fans, that explains the temperature increase, and the reduction in core clock as a result of it. So my next question is: should I try another 450 W BIOS from another vendor, and is there any you would recommend? Also, I am running a stable version of Afterburner; do I need a beta version to run the OC Scanner? I still can't get it to work. Thanks again!


----------



## asdkj1740

I have no idea why the FTW3 (Ultra) gets so much criticism about its PCB.
The FTW3 is $790 (on Newegg US); the FTW3 Ultra is $820.
The FTW3U's 19+3, at the ~$800 price range, has the highest vcore "phase" count out there (since Turing there are no more doublers).

For GDDR6X, 3 phases should be enough, because there are just 10 VRAM chips on the 3080 compared to 24 chips on the 3090, which is generally powered by 4 phases.
There are a few 3080 AIB models priced lower than the FTW3U that have 4 VRAM phases: Gigabyte Gaming OC 13+4 ($750), Gigabyte Eagle OC 13+4 ($730), ASUS TUF 16+4 ($699), ASUS TUF OC 16+4 ($750), Colorful iGame Ultra OC 16+4 ($740), iGame Advanced 16+4 ($780), and iGame Advanced OC 16+4 ($800).

Rumor has it the ASUS TUF ($699) is already discontinued. Colorful cards are not available worldwide. Gigabyte suffers seriously from the PCIe adapter problem. So there is just one model that clearly beats the FTW3U on VRAM phase count: the TUF OC ($750).

What we should call EVGA out on is that, like lots of other brands' models, the corner VRAM phase is not cooled at all; with the FTW3U's 19+3, that means only 2 VRAM phases have proper cooling.
The TUF, again, wins because there are no corner VRAM phases on the TUF PCB.








source: EVGA GeForce RTX 3080 FTW3 Ultra Review

Other examples of missing VRAM phase cooling:








Colorful iGame GeForce RTX 3080 Vulcan OC Review (techpowerup.com): thanks to a large triple-slot cooler the card achieves excellent temperatures, and its overclocking gains at the maximum power limit even beat the EVGA RTX 3080 FTW3 Ultra.












Gigabyte GeForce RTX 3090 Eagle OC Review (techpowerup.com)





Outstanding design: TUF








ASUS GeForce RTX 3080 TUF Gaming OC Review (techpowerup.com): a small factory overclock and a huge cooler that achieves temperatures much better than the NVIDIA Founders Edition, plus a dual BIOS with a "quiet" mode.






So the FTW3U has the highest vcore phase count and just one fewer VRAM phase on the PCB, but only two of those VRAM phases are properly cooled.
Does this uncooled VRAM phase need to be cooled well? I don't know, but check out this Gigabyte Gaming OC:








GIGABYTE GeForce RTX 3080 Gaming OC 10GB review (quasarzone.com)















Choice of VRM controller: I've never seen this raised as a problem here, where dudes short their shunts to get 600 W+.
Choice of input filtering and output filtering: same as above.
Actually, dudes here with 13+3 cards have no trouble getting 500 W+.
And what do the real enthusiasts truly want? A high power-limit BIOS and no power throttling, nothing more.
A digital VRM controller on board does not mean every user can access the controller and tweak things... after EVGA's TiN left, I doubt the Kingpin 3090 voltage tweaker and LN2 BIOS will be available on xdevs.com.


The real problems with the EVGA FTW3U are that it is overpriced, the hardware PCIe load-balance trick/lock that Frame Chasers has covered, and perhaps the VRAM MOSFET cooling.


Speaking of overpriced: the MSI Gaming Trio, 13+3, is priced at $760 (though a dude in the 3090 thread says the Gaming Trio's cooler at 500 W is insanely good).
What's more, in the Gigabyte AORUS series the Master (16+4, 2x8-pin, 370 W BIOS) charges $850 and the Xtreme (16+4, 3x8-pin, 450 W BIOS available) $900.

It is really biased to say the FTW3U could be better without a full picture of, if not all, AIB PCB designs.
And, again, stop calling the FE model the reference model. The FE is NVIDIA's own custom design; NVIDIA has a separate reference design, which can be found on lots of AIB entry-level models.
And if you take the $699 FE design for granted, then only one AIB model survives, the TUF at $699, and it is extremely out of stock (some say it is already discontinued); as for the rest of the AIBs, **** them all and wait for the AMD RX 6000.


BTW, I have heard there are many dead FTW3 cards reported on Reddit and the EVGA forum recently, and I have also seen that the EVGA FTW BIOS seems to allocate a lot of power to the GDDR6X compared to other AIB models, so... I don't know, haha.


----------



## LukeOverHere

VPII said:


> Hi there
> 
> I found the same when I flashed my MSI Gaming X Trio with the Strix bios. The problem I found with the three RTX 3080's I had was:
> 
> Palit Gamingpro OC would boost 300 to 315mhz from base boost clock
> Gigabyte Eagle OC would boost 225mhz from base boost clock
> MSI Gaming X Trio would boost 225mhz from base boost clock
> 
> When I flashed the MSI Gaming X Trio with the Strix OC bios I immediately saw a 3 to 4c temp increase at idle. When benching the card with more or less the same clocks, performance was not as good as with the MSI Gaming X Trio bios same with the Evga FTW3 Ultra bios. So at present I am back on the Gaming X Trio bios as it works best.


Thanks for the reply. By the sounds of it, I likely need to wait a bit longer for a more compatible BIOS. It still runs very well with the 400 W Colorful BIOS, so I may as well leave it for now while it's stable.


----------



## DrWaffles

Hey guys, I understand you're all shunt modding but I'm not quite ready to do that just yet.

I've got an FTW3 with the 450 W BIOS, but annoyingly it starts power limiting around 410-425 W. In some instances it does run at 450 W, but it's usually dropping voltage and clocks before then. Are there any other components that would cause the power limiter to kick in early? 

Before reverting to the stock BIOS, I briefly tried the Strix OC BIOS that was posted here, but with it GPU-Z basically reported nothing for 8-Pin #3 power and voltage.
Reported power did drop, but GPU-Z still flagged the Pwr limiter (reported board power was sub-380; I don't remember exactly).
Has anybody else experienced that? Is shunting my only option to get it actually drawing 450-500 W?


----------



## dr.Rafi

VPII said:


> I'm not arguing with you I am just stating the difference in boost clocks between various AIB cards.


I am not arguing either, just trying to explain the reason. I might sound rough in writing, but it's just my bad English.


----------



## dr.Rafi

LukeOverHere said:


> Thank you for this, that makes complete sense, if I no longer have the full RPM range on my fans then that explains the temp increase, and reduction of core clock as a result to the increased temperature.... so my next question is, should I try another 450W BIOS from another supplier, is there any you would recommend? Also I am running a stable version of Afterburner, do I need a BETA version to run the OC Scanner? I still cant get it to work? Thanks again !


I tried the EVGA FTW3 Ultra 450 W BIOS; that card has smaller fans and a smaller heatsink. The good thing about this BIOS is that it kicks up the voltage at each boost frequency. It gave me good results too, but a tiny bit lower than the Strix.


----------



## dr.Rafi

LukeOverHere said:


> Thank you for this, that makes complete sense, if I no longer have the full RPM range on my fans then that explains the temp increase, and reduction of core clock as a result to the increased temperature.... so my next question is, should I try another 450W BIOS from another supplier, is there any you would recommend? Also I am running a stable version of Afterburner, do I need a BETA version to run the OC Scanner? I still cant get it to work? Thanks again !


The OC Scanner isn't working for me either, but the latest beta can read the core voltage on Ampere, and it can read and show on the on-screen display the wattage the card is consuming. Are you not shunting? If you are, you can correct the on-screen wattage value using the correction formula in the Afterburner settings.


----------



## dr.Rafi

techenth said:


> https://www.3dmark.com/3dm/53114090?
> 
> 
> 
> 
> https://www.3dmark.com/3dm/53113941?
> 
> 
> 
> 
> Should I keep it? What kind of performance improvement should I expect if I went for the Strix?
> 
> Update: https://www.3dmark.com/3dm/53144813?


My friend, you are running a Lamborghini on a rocky mountain street; what you really need is a new CPU.


----------



## dr.Rafi

DrWaffles said:


> Hey guys, I understand you're all shunt modding but I'm not quite ready to do that just yet.
> 
> I've got a FTW3 with the 450w bios, but it starts power limiting around 410-425w annoyingly.. Some instances it does run at 450w, but it's usually dropping voltage and clocks before then. Is there any other components that would cause a power limiter to kick in early?
> 
> Before reverting to the stock bios, I briefly tried the Strix OC bios that was posted here, but that only made GPU-Z report 8-Pin #3 power and voltage basically report nothing.
> Reported power did drop, but it still was coming up as pwr limiter on GPU-Z (Reported board power was sub 380, don't remember exactly)
> Anybody else experience that? Is my only option to get it actually drawing 450-500 shunting?


Good day mate. With shunting you can go up to 700 W max on cards with 3 power connectors; the max officially supported is 525 W, but with a good power supply it will easily run pulling 700 W. I hope you're not worried about the bills.


----------



## VPII

dr.Rafi said:


> I am not arguing either trying just to explain the rseon, i might sound rough in writing, but its just my bad english.


No worries, boet... (boet is "brother" in Afrikaans). I really do understand, and your comment was perfectly fine, same as your English.


----------



## VPII

dr.Rafi said:


> I am not arguing either trying just to explain the rseon, i might sound rough in writing, but its just my bad english.


Oh and what you stated about the fan speed makes perfect sense in all honesty. I never thought about it that way but it does make perfect sense.


----------



## DrWaffles

dr.Rafi said:


> Good day mate, with shunting you can go up to 700watt max with 3 x ppower connectors cards , max spported is 525 watt but with good power supply you easy run it pulling 700 watt, but hope you not worry for bills.


Good thing we're getting solar panels in a few weeks  

Any notes on the Strix bios not showing the 3rd power pin readings?


----------



## BluePaint

Interestingly, the 5800X @ 4.8 GHz with [email protected] memory hasn't actually improved my Time Spy GPU score over the result with the 3900X + [email protected] RAM yet, lol.

But Firestrike really likes the 5000-series CPUs; the leaderboard is dominated by them.
I managed to get 2nd place for FS Extreme total score, because it doesn't seem to use more than 8 cores 
FS Extreme 3080 Leaderboard
FSE 23510 GPU, 22586 overall


----------



## DirtyScrubz

DOOOLY said:


> Well I received EK 3080 Strix water block but still waiting for a card 😥Here some pictures
> View attachment 2465437
> View attachment 2465438


I thought these weren’t released yet??

edit: nm just noticed you bought the acetal version.


----------



## cstkl1

What a screwed-up optimization AC Valhalla has for NVIDIA.
During gameplay the GPU hardly pushes the TGP past around 300 W... but wait: during certain menus and the choice screens, it goes to 400 W. 

Is it a deliberate move to screw with NVIDIA?


----------



## asdkj1740

cstkl1 said:


> what a screwed up optimization AC valhalla for nvidia
> during game play the gpu hardly pushed the TGP around 300w.. but wait wait.. during certain menu etc.. it goes 400w.. or choices times.. 400w..
> 
> a deliberate to screw with nvidia is it..


This is a Ubisoft feature of the AC series; the menu is the most stressful part.


----------



## cstkl1

asdkj1740 said:


> this is ubisoft feature on ac series, the menu is the most stressing part.


Dude, it's affecting only NVIDIA GPUs.

You know the part where you get to choose your gender? Full-on 98-99% usage, 150-160 fps,

400 watts...

Guess which card is coming out tomorrow that is, coincidentally, 300 W. 
Guess what's been headlining the last few days: the 3090 cannot hit 60 fps. Yeah, sure, at 300 W...

BTW, you would think that's the end of it, right? Nope. NVENC recording reduces fps because it runs under that same 300 W cap. 

So you tell me, is it a coincidence the 6800 XT is @300 W?


----------



## werks

VULC said:


> Can anyone confirm if the Strix OC ROM is compatible with the Colorful Vulcan OC?


I just picked one up; do you know if it's compatible?


----------



## FRAUSS 79

I have this video card, but the power limit is 320 W. 
Even pushed up to 108% in MSI Afterburner, it stays at 320 W. 

Is that normal?


----------



## joyzao

Guys, I have a ROG Strix OC 3080, and I was wondering: is there any BIOS that would improve it? I don't know if I was unlucky, but mine barely overclocks at all. 

Do you think the FTW3 Ultra is a better card than the ROG Strix?


----------



## dr.Rafi

BluePaint said:


> Interestingly, the 5800X @ 4.8Ghz with [email protected] memory hasn't actually improved my Timespy GPU score over the result with the 3900X + [email protected] RAM yet, lol.
> 
> But Firestrike really likes the 5000er CPUs. The leaderboard is dominated by them.
> I managed to get 2nd place for FS extreme total score because it doesn't seem to use more then 8 cores
> FS Extreme 3080 Leaderboard
> FSE 23510 GPU 22586 overall


The 5800X is close to 3900X performance in general and won't beat it in multi-threaded workloads, and the funny thing is it's nowhere close to the 10900K in games. An overclocked 10900KF limited to 140 W max, to make it comparable to a maxed 5800X, gives me better scores than both a maxed 5800X and a maxed 3900X: a 15500 Time Spy CPU score for the 10900KF at 140 W.


----------



## dr.Rafi

joyzao said:


> guys. I have a rog strix oc 3080, I was wondering if there are any bios that I can improve? I don't know if I was unlucky, but mine is not going up almost anything in the overclock
> 
> Do you think ftw ultra is a better card than rog strix?


Nope. Unless you lost the silicon lottery, the Strix is the better card; it has the best BIOS and is already overclocked from the factory. The only way to improve it is by shunting.


----------



## Alemancio

I just got a Ventus OC, how bad is it really? Which BIOS do you recommend I run on it?

Thanks!

Edit: I'm seeing that some people have flashed the TUF or the Master BIOS onto it, but the actual watt usage doesn't go beyond 320W? Also, there are different PCB versions?


----------



## dev1ance

So what's with all the dying EVGA cards? Have they figured it out yet?


----------



## Alemancio

dev1ance said:


> So what's with all the dying EVGA cards? Have they figured it out yet?


The what? Keep in mind that I think EVGA has a higher market share on 3080s.


----------



## VPII

Okay, yes, I used dry ice for my 5950X, but I'm pretty happy with this..... the highest Fire Strike result with an RTX 3080....



https://www.3dmark.com/fs/24050137


----------



## Erik9519

I've recently gotten an Inno3D RTX 3080 iChill Frostbite and I was wondering, given the stock power limit of 340W, which BIOSes would be safe to flash on it to get a ~370W power limit?
I wanted to flash it because the card overclocks really well given the ~40C load temps but it's constantly smashing against the power limit which causes it to stay below 2GHz on heavy load.


----------



## dev1ance

Alemancio said:


> The what? Consider that I think EVGA has higher market share on 3080s


I don't think they have a higher market share than Colorful/Asus/MSI/Gigabyte combined. I don't see dead cards cropping up like FTW3.


----------



## acoustic

Who has dead cards? I haven't heard anything. My FTW3 is over a month old now, running great on the 450watt BIOS.


----------



## dev1ance

I don't understand why people are getting defensive and replying with their anecdotal evidence when there is a sizable number of complaints directly on EVGA forums regarding dying 3080s and 3090s. You don't see that with other AIBs. I merely asked if anyone knew anything new but I understand why I don't bother with these forums anymore. Enjoy your cards.


----------



## acoustic

I asked where you heard about dead cards and threw in that my card has been fine. LOL. Read your post to yourself and ask ... "who's getting defensive and riled up again?"


----------



## squadz

Alemancio said:


> I just got a Ventus OC, how bad is it really? Which BIOS do you recommend I run on it?
> 
> Thanks!
> 
> Edit: im seeing that some people have flashed the TUF or the Masters onto it but that the actual watt usage doesnt go beyond 320W???? Also there are different PCB versions???


I'd like to know this as well, same card.

Better question: if a BIOS does work and you can pass the limit, what sort of gains are we actually talking about, FPS-wise?


----------



## xermalk

Has anyone flashed a TUF OC or non-OC to a BIOS with more than 375W PL?
Or do the 3-connector cards' BIOSes just not work on 2-connector cards?
A 2-connector card with some thicker wires in the 8-pins would have no issue delivering way more power than 375W.


----------



## Alemancio

acoustic said:


> I asked where you heard about dead cards and threw in that my card has been fine. LOL. Read your post to yourself and ask ... "who's getting defensive and riled up again?"


Let him be, he's cray.



squadz said:


> I'd like to know this as well, same card.
> 
> Better question is, if a bios does work and you can pass the limit, what sort of gains are we actually talking about fps wise?


Mainly what it would do is let you sustain certain boost clocks. Imagine all 3080 dies can do 2000MHz at 1.063V but require 380W to do so. A Ventus capped at 320W would struggle hard to even get beyond 1900MHz, because it cannot provide sufficient amps to the core, and so it downclocks.
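To make that concrete, here's a toy Python sketch. All the numbers are hypothetical illustration values (the V/F points and the constant `k` are made up, with `k` tuned so 2000MHz @ 1.063V lands near the 380W figure above); the idea is just that board power grows roughly with V²·f, so the highest sustainable point on the boost curve drops as the power limit drops.

```python
# Toy model: pick the highest V/F point whose estimated board power
# fits under the card's power limit. Curve points and the constant k
# are made-up illustration values, not measured 3080 data.

VF_CURVE = [(1800, 0.850), (1900, 0.900), (1950, 0.950), (2000, 1.063)]  # (MHz, V)

def board_power(freq_mhz, volts, k=0.168):
    # Rough C*V^2*f scaling; k chosen so 2000 MHz @ 1.063 V is ~380 W
    return k * volts ** 2 * freq_mhz

def max_sustained_clock(power_limit_w):
    ok = [f for f, v in VF_CURVE if board_power(f, v) <= power_limit_w]
    return max(ok) if ok else None

print(max_sustained_clock(380))  # a 380 W budget holds the top point: 2000
print(max_sustained_clock(320))  # a 320 W budget is forced down the curve: 1950
```

Same silicon, same curve; only the power budget differs, and the sustained clock follows it.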


----------



## VladimirAG

Is this good result or..?


----------



## dr.Rafi

VladimirAG said:


> Is this good result or..?
> View attachment 2465971


1080p Extreme is more intensive.


----------



## DStealth

Yes, it's good. What card, a Strix? Modded and cooled?
This is the best I did with the EVGA on air; better cooling will probably get 16.5+.








Have a lot to re-bench with my 5900x holding 5ghz all cores in 3d.


----------



## asdkj1740

MSI Suprim X, yet another $900 16+4 custom model. Let's see what gets said after the "should be better" FTW3 Ultra, 19+3 at $820, lol.


----------



## Pupkin_San

Hi guys! I'm choosing between EVGA FTW3 and ASUS TUF. Wanna try hard modding and controlling the voltages. Do both cards support EVC2sx? I've seen in Der8auer video that TUF is very easily modded. Is it the same thing with FTW3? Thanks!


----------



## Colonel_Klinck

Shunts have arrived, so which shunts do you recommend I use on my TUF OC? I don't want to go mad; getting it up around 450W would be fine, it's on water.


----------



## Shadowdane

Wccftech & TechPowerUp posted reviews of the MSI Suprim X 3080 card:

MSI GeForce RTX 3090 SUPRIM X & RTX 3080 SUPRIM X Graphics Cards Review - Premium Design & Everything Else!
MSI has released its brand new SUPRIM X series in GeForce RTX 3090 & GeForce RTX 3080 flavors which I will be taking a look at today.
wccftech.com

MSI GeForce RTX 3080 Suprim X Review - The Biggest Graphics Card in the World
The MSI GeForce RTX 3080 Suprim X is the new flagship series from MSI. It comes with the biggest cooler we've ever seen, which achieves unbelievable noise levels, quieter than any other RTX 3080. A large factory OC is included, too, and a dual BIOS as well.
www.techpowerup.com

20-phase (16+4) design with a 430W BIOS. Seems it might be the quietest card on the market that pushes 400W+.


----------



## Erik9519

Was a fix ever found for the power limit not working? I have an Inno3D RTX 3080 iChill Frostbite which is meant to have a stock power limit of 340W, and yet it hovers around ~320W.
In the log during Port Royal you can see that the PerfCap reason is Pwr and that the % TDP hardly ever crosses 95%... I've tried drivers 456.38 and 457.30; neither of them fixed the issue.


----------



## rankftw

Erik9519 said:


> Was a fix ever found for the power limit not working? I have a Inno3D RTX 3080 iChill Frostbite which is meant to have a stock power limit of 340W and yet it hovers around ~320W.
> In the log during Port Royal it can be seen that the PerfCap reason is Pwr and that the % TDP hardly ever crosses 95%... I've tried drivers 456.38 and 457.30 neither one of them fixed the issue.


I have a similar issue on my Palit Gaming Pro. Power limit is 350 but I get power perf cap at 330 with the power slider maxed.


----------



## SEALBoy

Hey guys just checking in if someone found a way around the MSI 3080 Ventus' 320W power limit without shunting.


----------



## Nizzen

Colonel_Klinck said:


> Shunts have arrived, so what shunts do you recommend I use on my TUF OC? I don't want to go mad, getting it up around 450w would be fine, its on water.


8 is perfect. Enough power for water.


----------



## Colonel_Klinck

Nizzen said:


> 8 is perfect  Enough power for water


Cheers dude, job for the weekend.


----------



## arrow0309

Shadowdane said:


> Wccftech & TechPowerUp posted reviews of the MSI Suprim X 3080 card:
> 
> MSI GeForce RTX 3090 SUPRIM X & RTX 3080 SUPRIM X Graphics Cards Review - Premium Design & Everything Else! (wccftech.com)
> 
> MSI GeForce RTX 3080 Suprim X Review - The Biggest Graphics Card in the World (www.techpowerup.com)
> 
> 20 Phase (16+4) design with 430W BIOS. Seems it might be the quietest card on the market that pushes 400W+.


There's my new bios to replace the 350W one on my X Trio. 😎


----------



## techenth

xermalk said:


> Has anyone flashed a TUF OC or non-oc to a bios with more then 375W PL?
> Ort does the 3 pin cards bioses not work on 2 pin cards?
> A 2 pin card with some thicker wires in the 8 pins would have no issue delivering way more power then 375w.


I did. I flashed the Strix's latest BIOS on my card and experienced the 450W max TDP first-hand. 100% PL equals 360W, and it goes up to 121%.
I had lower clocks than usual though, and the PC rebooted without a BSOD because my PSU couldn't handle it. I'm switching to a new PC/PSU soon and will try it again.

It's safe from what I can tell. My guess is that you won't have PSU amperage issues if your PCIe cable from the PSU has an 8-pin + 6-pin connector (pigtails) and you're feeding the card from two different rails.


----------



## Stash

arrow0309 said:


> There's my new bios to replace the 350W one on my X Trio. 😎


😎

I'm about to give it a spin. The only negative is not being able to re-set EEPROM protection after flashing, but I figure that's the norm with Ampere atm.


----------



## vigorito

What is the factory power limit for the Asus Strix RTX 3080 OC, 320W or higher? And why are you guys flashing the Strix model to 450W? I mean, what is the real benefit?


----------



## VPII

vigorito said:


> What is factory power limit for asus strix rtx 3080 OC 320w or higher? and why you guys are flashing to 450w strix model?i mean what is real benefit


The Asus RTX 3080 Strix OC is 370W stock and can be increased to 450W.... that is its normal BIOS.


----------



## vigorito

You mean I can flash it to 450W, or spikes are 450W? I'm not planning to flash anything, I want to use it stock. My only option is to buy a Seasonic GX-850 Focus Gold Plus to pair with a 5600X; is that enough for the Strix?


----------



## VPII

vigorito said:


> You mean i can flash it to 450w or spikes are 450w? im not planning to flash anything,i want to use it as it stock,i have only option to buy seasonic gx850 focus gold plus pair with a 5600x is it enought for strix?


My friend, if you have the Asus RTX 3080 Strix OC then your card would normally run with a 370W power limit. In MSI Afterburner you can increase that power limit to 450W; that is normal.


----------



## xermalk

I did not quite expect this.
I flashed the Strix OC BIOS on my TUF non-OC and massively lost performance (but gained 100W of draw).

Core temp never went above 65C.

STRIX 121/100
STRIX STOCK
TUF - 117/100









Edit: managed to get up to 11333 points with the Strix BIOS, but at a locked 0.9V and a power draw of 430-440W.
Going back to the original TUF BIOS, as that's just silly.

Also, anything at 1995MHz or above is a guaranteed hard lock on my TUF.


----------



## rankftw

xermalk said:


> I did not quite expect this.
> Flashed the STRIX OC bios on my TUF non oc, and massively lost performance (but gained 100w draw)
> 
> Core temp never went above 65c
> 
> STRIX 121/100
> STRIX STOCK
> TUFF - 117/100


The TUF only has 2 PCIe power connectors and the Strix has 3. Your TUF isn't capable of pulling that much power and the BIOS will be messing with your card and not reporting correctly. You should just flash it back mate.


----------



## xermalk

rankftw said:


> The TUF only has 2 PCIe power connectors and the Strix has 3. Your TUF isn't capable of pulling that much power and the BIOS will be messing with your card and not reporting correctly. You should just flash it back mate.


It has dual BIOS. 11821 with the TUF BIOS now at 345W.
Also, 2x 8-pin can easily handle 450W; miners have been pulling 300W *per cable* without issues.
However, something on the card obviously can't handle it.


@zhrooms
The "375W PL" is BS, and others have the exact same issue when I search around.
The slider can go that high, but the card refuses, and shows the power limit flag whenever it passes 345W.

The first page should be changed until someone actually proves it can do 375W.


----------



## BluePaint

xermalk said:


> It has dual bios  11821 with the TUFF bios now at 345W.
> Also 2x8pin can easily handle 450w. Miners have been pulling 300w *per cable* without issues.
> However, something on the card obviously cant handle it.


It has nothing to do with how much wattage an 8-pin connector can provide, but with how the PL is calculated. In short, a BIOS with PL calculation for 3x 8-pins will not work correctly (and will actually reduce performance) when you only have 2 physical 8-pins. There are many, many posts about this in this thread. Sorry!


----------



## Rbk_3

How is everyone's Time Spy and Port Royal looking at stock? Mine seems a tad low; average clocks are well under 1900, especially for Time Spy. Is that normal?



https://www.3dmark.com/pr/522284




https://www.3dmark.com/spy/15412304




Between the Timespy and the Port Royal I flashed the low temp bios so that is why the temps are so different.


----------



## MRLslidchen

The math is pretty easy for that. Let's start with the max TDP for a 2x 8-pin 375W BIOS:

75W (PCIe slot) + 150W (1st 8-pin) + 150W (2nd 8-pin) = 375W


Now the max TDP for the 450W 3x 8-pin BIOS is split like this:

75W (PCIe slot) + 125W (1st 8-pin) + 125W (2nd 8-pin) + 125W (3rd 8-pin) = 450W


Now let's say you flash the 450W BIOS onto a 2x 8-pin card: you will be missing the 3rd 8-pin that provides the additional 125W. But the limits per rail are already set in the BIOS (75W for PCIe and 125W per 8-pin), meaning your card can only draw 325W (PCIe + 8-pin 1 + 8-pin 2). In the end, you would lose performance because of the missing 3rd pin.

I hope this helps. Still struggling with English.
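The arithmetic above can be sketched in a few lines of Python. The rail caps are the hypothetical per-connector BIOS budgets from the post, not values dumped from a real vBIOS:

```python
# Per-rail power-limit arithmetic: total TDP is the slot budget plus the
# cap of each 8-pin the BIOS knows about, but a card can only draw from
# connectors that physically exist.

PCIE_SLOT_W = 75

def total_power_limit(rail_caps_w):
    # TDP as encoded in the BIOS: slot + every 8-pin cap
    return PCIE_SLOT_W + sum(rail_caps_w)

def usable_on_card(bios_rail_caps_w, physical_connectors):
    # Flashing a 3x8-pin BIOS onto a 2x8-pin card silently drops
    # the third rail's budget from what the card can actually draw.
    return PCIE_SLOT_W + sum(bios_rail_caps_w[:physical_connectors])

print(total_power_limit([150, 150]))        # 2x8-pin 375 W BIOS -> 375
print(total_power_limit([125, 125, 125]))   # 3x8-pin 450 W BIOS -> 450
print(usable_on_card([125, 125, 125], 2))   # that BIOS on 2 connectors -> 325
```

So the "450W" BIOS on a 2-connector card actually buys you less headroom (325W) than the card's own 375W BIOS, which is exactly the performance loss people report.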


----------



## Falkentyne

MRLslidchen said:


> The math is pretty easy for that. Let‘s start with the max. TDP for 2x 8Pin 375W Bios:
> 
> 75W (PCIe Slot) + 150W (1st 8Pin) + 150W (2nd 8Pin) = 375W
> 
> 
> Now the max. TDP 450W Bios for 3x 8Pin is split like this:
> 
> 75W (PCIe Slot) + 125W (1st 8Pin) + 125W (2nd 8Pin) + 125W (3rd 8Pin) = 450W
> 
> 
> Now lets say you flash the 450W Bios on a 2x 8Pin card then you will be missing the 3rd 8Pin to provide the additional 125W. But the Limits per rail is already set in the bios (75W for PCIe and 125W per pin). Meaning your card can only draw 325W (PCIe + 8Pin 1 + 8Pin 2). In the end, you would lose performance because of the missing 3rd pin.
> 
> I hope this helps. Still struggling with english.


This is not exactly true for TDP, technically. Any rail reaching its limit will trigger PWR, even if all three are not at the limit; it just takes one to be there. The problem is, I haven't seen a single board that loads all three 8-pins evenly. It's usually something like 150W+150W+75W+75W from what I've seen. Someone correct me if I'm wrong. I've also never seen a board with a 125W cap on any 8-pin.
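In code form, the point is that the throttle check is a per-rail any(), not a comparison against the summed TDP. The caps and the draw split below are invented numbers for illustration only:

```python
# The Pwr perf cap fires as soon as ANY monitored rail reaches its own
# limit, even if the card's total draw is well below the summed TDP.

def pwr_capped(draw_w, caps_w):
    # True as soon as one rail's draw meets or exceeds its cap
    return any(d >= c for d, c in zip(draw_w, caps_w))

caps = [75, 150, 150, 75]   # slot + uneven 8-pin caps, like the split above
draw = [60, 150, 120, 50]   # total 380 W, but the second rail is saturated

print(sum(caps))               # summed TDP: 450
print(sum(draw))               # actual draw: 380
print(pwr_capped(draw, caps))  # True: one rail at its cap is enough
```

This is why an unevenly loaded board can show the Pwr perf cap while the reported total draw still sits well under the BIOS TDP.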


----------



## MRLslidchen

That was just the math to explain why a 450W BIOS doesn't perform better on 2x 8-pin cards. It's the same topic as with the 2080 Ti in the past, when people flashed 3-pin BIOSes to get more than 380W on their 2x 8-pin cards.


----------



## Rbk_3

VPII said:


> Okay, over the past month or so I've been playing around with a Palit RTX 3080 Gamingpro OC, Gigabyte Eagle RTX 3080 OC and now finally got a MSI RTX 3080 Gaming X Trio which is by far the best. The normal bios for this card works well and temps are pretty good. Taken that it does take 3 x 8 pin PCIe power connectors I did try the Asus Strix OC bios as well as the EVGA FTW3 ULTRA but temps were like 53 to 54c with fans at 100% where as with the MSI Gaming X Trio it would be 44 to 45c max. Yes I do understand that the increased power would mean increased heat but most of the benches I ran failed as well. Interestingly I've noticed now on three occasions that the max boost clocks per AIB is different.
> 
> If you take the Palit card with base clock 1740mhz it boost at stock 2040 to 2055mhz which means 300 to 315mhz.
> 
> With the Eagle OC you have a base of 1755mhz it would boost at stock to 1980mhz which means 225mhz boost.
> 
> With the MSI Gaming X Trio you have a base clock of 1815mhz it boost at stock to 2040mhz which means 225mhz boost from base.
> 
> When using the Asus Strix OC bios you have a base clock of 1905mhz and the max boost I have seen while running the same benchmark was 2040 which means 135mhz boost from base.
> 
> When using the Evga FTW3 Ultra bios you have a base clock of 1800mhz and the max boost I have seen was 2025mhz and as such a boost of 225mhz from base.
> 
> Now correct me if I am wrong. But from what I remember with the RTX 2080 Ti I had, the boost was 300mhz+ from base which makes the Palit the only card that actually boost more or less the same as the RTX 2080 ti. Look I may be wrong, but this is what I have seen from my tests, actually having a Palit, Eagle OC and now MSI Gaming X Trio.


Damn, so I think there is something wrong with my Trio. It only boosts to a max of 1950 at stock in Port Royal, with the only change being a 102% power limit. Is there something I am maybe doing wrong?



https://www.3dmark.com/pr/522284




I tried a manual OC to 2010MHz at 0.95V and couldn't even get through a Time Spy. It did make it through at 0.975V at 1995MHz. I think I may have lost the lottery.



https://www.3dmark.com/spy/15445883


----------



## xermalk

MRLslidchen said:


> The math is pretty easy for that. Let‘s start with the max. TDP for 2x 8Pin 375W Bios:
> 
> 75W (PCIe Slot) + 150W (1st 8Pin) + 150W (2nd 8Pin) = 375W
> 
> 
> Now the max. TDP 450W Bios for 3x 8Pin is split like this:
> 
> 75W (PCIe Slot) + 125W (1st 8Pin) + 125W (2nd 8Pin) + 125W (3rd 8Pin) = 450W
> 
> 
> Now lets say you flash the 450W Bios on a 2x 8Pin card then you will be missing the 3rd 8Pin to provide the additional 125W. But the Limits per rail is already set in the bios (75W for PCIe and 125W per pin). Meaning your card can only draw 325W (PCIe + 8Pin 1 + 8Pin 2). In the end, you would lose performance because of the missing 3rd pin.
> 
> I hope this helps. Still struggling with english.



I know what you mean, but it's literally reporting an actual power draw of 450W.
If it's 125+125, I should never even have gotten to my old level.
I can also limit the voltage and never trigger the "power limit" @ 450.

But performance is still worse.

What I think is happening is that the Strix BIOS is causing wrong power readouts from the sensors. That would explain it.


----------



## arrow0309

Stash said:


> 😎
> 
> I'm about to give it a spin. Only negative is not being able to re-set EEPROM protection after flashing but I figure that's the norm with the Ampere atm.


Keep us informed, pal. I'm playing at 2100 minimum in both WDL and ACV at the moment with the Trio's stock BIOS, so there's no need to increase the power draw, but I'll surely give it a run soon as well. Maybe I'll even put her under water, and then it's a must.


----------



## Falkentyne

Rbk_3 said:


> Damn, so I think there is something wrong with my Trio. It is only boosting to max 1950 at stock in Port Royal with the only change being a 102 power limit. Is there something I am maybe doing wrong?
> 
> 
> 
> https://www.3dmark.com/pr/522284
> 
> 
> 
> 
> I tried a manual OC to 2010MHZ at 0.95V and couldn't even get through a Time Spy. It made it though at 0.975 at 1995MHZ. I think I may have lost the lottery.
> 
> 
> 
> https://www.3dmark.com/spy/15445883


You didn't lose the lottery. Port Royal will try to draw about 450-500W of power on Ampere. That's way above your power limit: 340 watts * 102% is only 350 watts. Your card is going to throttle the boost clocks hard to maintain that.


----------



## Stash

Rbk_3 said:


> Damn, so I think there is something wrong with my Trio. It is only boosting to max 1950 at stock in Port Royal with the only change being a 102 power limit. Is there something I am maybe doing wrong?
> 
> 
> 
> https://www.3dmark.com/pr/522284
> 
> 
> 
> 
> I tried a manual OC to 2010MHZ at 0.95V and couldn't even get through a Time Spy. It made it though at 0.975 at 1995MHZ. I think I may have lost the lottery.
> 
> 
> 
> https://www.3dmark.com/spy/15445883


I wouldn't worry, my Trio was around 1980 at stock because it hits the max TDP quite easily and doesn't go any higher without variable instability.



arrow0309 said:


> Keep us informed pal, I'm playing @2100 min on both WDL and ACV for the moment with the Trio's stock bios so there's no need to increase any power draw but I'll surely give it a run as well soon, maybe I'll even put her under water then it's a must.


Gave it a brief exploratory run in Port Royal, 12k @ 2.1GHz @400w so not that much improvement over stock vBIOS tbh.


----------



## Rbk_3

Stash said:


> I wouldn't worry, my Trio was around 1980 at stock because it hits the max TDP quite easily and doesn't go any higher without variable instability.
> 
> 
> 
> Gave it a brief exploratory run in Port Royal, 12k @ 2.1GHz @400w so not that much improvement over stock vBIOS tbh.


What did you end up doing in terms of an undervolt/OC? Do you have any Time Spy or Port Royal runs I could look at?

I really hope MSI releases an official BIOS with a higher TDP. It's ridiculous that they put three 8-pins on it for the current BIOS.


I also got a TUF that seems to perform better in benches, but the coil whine is unbearable, so I am going to send that one back.


----------



## Stash

Rbk_3 said:


> What did you end up doing in terms of an underplot/OC? Do you have any Timespy or Port Royal runs I could look at?
> 
> I really hope MSI releases an official BIOS with a higher TDP. Ridiculous they have 3 8 PINS for the current BIOS.
> 
> 
> I also got a TUF that seems to perform better in benches but the coil whine is unbarleable so I am going to send that one back.


https://www.3dmark.com/pr/524788 is the brief test I ran earlier.

With the Suprim coming out, I doubt we'll ever see an official >350W TDP for the Trio; after all, they didn't just add three MLCC clusters, a few power phases, and a BIOS switch for £200, right?

What's sad is that the Trio *should* have those features, else the 3x8-pin is a waste, but I digress; I got a random card on launch, can't be too upset with it.

Can't remember on stock; I think the lowest I undervolted was around 1920 @ 925mV.


----------



## Rbk_3

Stash said:


> https://www.3dmark.com/pr/524788 is the brief test I ran earlier.
> 
> With the Suprim coming out I doubt we'll ever see an official >350w TDP for the Trio, after all they didn't just add three MLCC clusters, few power phases, and a BIOS switch for just £200 right?
> 
> What's sad is that the Trio *should *have those features, else it's a waste of 3x8 but I digress, got a random card on launch; can't be too upset with it.
> 
> Can't remember on stock, think the lowest I undervolted was around 1920 @ 925mv.


Damn, that is better than I have been able to get; what settings was that at?

The best Port Royal I've gotten so far was at 0.975V / 1995MHz. My average clocks are 120MHz lower than yours.



https://www.3dmark.com/pr/523454


----------



## Alemancio

Do we have boost analytics on Ampere cards (that are statistically significant, not just one random guy who plays at 2130MHz)? I'd love to see p50s and p90s for boost or max OC on air across a large set of cards.

Thanks


----------



## dr.Rafi

xermalk said:


> I did not quite expect this.
> Flashed the STRIX OC bios on my TUF non oc, and massively lost performance (but gained 100w draw)
> 
> Core temp never went above 65c
> 
> STRIX 121/100
> STRIX STOCK
> TUFF - 117/100
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: managed to get up to 11333 points with the strix bios. but at a locked 0.9v and a power draw of 430-440w.
> Going back to the original tuf bios as thats just silly.
> 
> Also anything at 1995 mhz or above is a guaranteed hard lock on my TUF.


You can increase the clock speed more with the Strix BIOS, so it will achieve similar scores but not better. The Strix BIOS is designed to pull around 125W max from each 8-pin connector and 70W from the PCIe slot, which is roughly 445W total. In your case you only have 2x 8-pin connectors, so it maxes out around 320W while the BIOS thinks it's at 445W, because all the existing power sources are maxed. The TUF BIOS is designed to pull 370W max: 150W from each of the two power rails and 70W from the slot, so it maxes at 370W = better scores.


----------



## Stash

Rbk_3 said:


> Damn, that is better than I have been able to get, what settings was that at?
> 
> Best Port Royal I was able to get so far was at .975v 1995mhz. My average clocks are 120mhz lower than yours.
> 
> 
> 
> https://www.3dmark.com/pr/523454


Nothing special, just wanted to test that the PL & fans were working on the new BIOS. Judging by the temps from that run I'm guessing I was probably close to stock on all settings.

https://www.3dmark.com/pr/527230 is probably more accurate to what a flashed Suprim->Trio is capable of, tad underwhelming though! How's your cooling situation? Your 67°C average at 0.975v seems to be higher than I would expect, probably throttling your core speed a lot tbh.

Has any non-STRIX 3x8pin hit 13k PR on air?


----------



## VPII

Rbk_3 said:


> Damn, so I think there is something wrong with my Trio. It is only boosting to max 1950 at stock in Port Royal with the only change being a 102 power limit. Is there something I am maybe doing wrong?
> 
> 
> 
> https://www.3dmark.com/pr/522284
> 
> 
> 
> 
> I tried a manual OC to 2010MHZ at 0.95V and couldn't even get through a Time Spy. It made it though at 0.975 at 1995MHZ. I think I may have lost the lottery.
> 
> 
> 
> https://www.3dmark.com/spy/15445883


Hi there, for stock the boost is maybe 15 to 30MHz lower than what I get, but your temps are pulling it down. I usually run with temps not reaching 60C during the run, as I set the fan to 100% when running benchmarks.


----------



## SoldierRBT

Stash said:


> Nothing special, just wanted to test that the PL & fans were working on the new BIOS. Judging by the temps from that run I'm guessing I was probably close to stock on all settings.
> 
> https://www.3dmark.com/pr/527230 is probably more accurate to what a flashed Suprim->Trio is capable of, tad underwhelming though! How's your cooling situation? Your 67°C average at 0.975v seems to be higher than I would expect, probably throttling your core speed a lot tbh.
> 
> Has any non-STRIX 3x8pin hit 13k PR on air?


Got 13286 in Port Royal with my FTW3 on air. A 120mm fan on the backplate seems to improve core stability (backplate gets hot).



https://www.3dmark.com/pr/526000



450W BIOS 1.056v locked + 240 Core
+1200 memory


----------



## ssgwright

Damn, that's an amazing score... I can only get about 12,700 on my TUF, and that's on water. I don't have a backplate though... think I might need to get one.


----------



## Stash

Damn, that's a mad one @ 56 degrees as well; did extra cooling on the backplate bring it down a lot? How does the FTW3 do around the 1V mark?

Think I found the sweet spot: avg 2.1 @ 1012mV @ 61 degrees. Can't be sure if that's an improvement vs. 350W as I didn't run PR before I flashed... 🙃


----------



## SoldierRBT

Stash said:


> Damn that's a mad one @ 56 degrees as well, did extra cooling on the backplate bring it down a lot? How does the FTW3 do around the 1v mark?
> 
> Think I found the sweetspot. avg: 2.1 @ 1012mv @ 61 degrees. Can't be sure if that's an improvement vs. 350w as I didn't run PR before I flashed... 🙃


Thanks. Not much, around 1-2C less; it let me add +15MHz on the core in the Port Royal run. I didn't see any improvement in memory OC with the 120mm fan on the backplate; after +1200, performance decreases. I did a video last month running Quake 2 RTX at 1.012V locked @ 2190MHz, and got 2175-2160MHz avg after it got to around 60C. I'd recommend testing your OC with this game (free on Steam).


----------



## Rheinfels

Without reading all 124 pages: I was able to get a 3080 Ventus shortly after release, which is now under water. Currently I'm running [email protected]@about 250 Watt. I would like to test the maximum and would therefore be interested in a BIOS allowing the maximum for 2x 8-pin + PCIe connector (= 375W?!?). Are there any experiences yet with flashing this specific card? Which BIOS should I use? I'm using 2x DP, but I guess that's supported by all 3080s? As the card is watercooled, I don't care much about fans or RGB that might be different.


----------



## xermalk

ssgwright said:


> damn that's an amazing score... I can only get about 12,700 on my tuff and that's on water. I don't have a backplate though... think I might need to get one.


That's great for a TUF with only 2 PCIe connectors.
I'm stuck at 11881, and that's with the fans pegged at 100%, doing a "cold" Port Royal run with vcore locked to 0.925.

Can't get above 1995 stable even at 0.95V, and even at 0.95 it's power limited through the entire run. I need to go down to 0.9V to not be power limited.


----------



## Rheinfels

Rheinfels said:


> Without Reading all 124 Pages: I was able to get a 3080 Ventus shortly after release which is now under water. Currently I'm running [email protected]@about 250 Watt. I would like to test the maximum and would be therefore interested in a BIOS allowing to use the maximum of 2x 8Pin + PCIe connecter (= 375 Watt?!?). Are there any experiences yet with flashing this specific card? Which BIOS to use? I'm using 2x DP, but I guess that's supported by all 3080s? As the card ist watercooled I don't care much about fans or RGB that might be different.


Tested my Ventus with an XC3 Ultra BIOS, as it has the same video output and power stage configuration. The results are strange:









I started using the same configuration in Afterburner, [email protected],875V. With the original Ventus BIOS the card consumes about 280W during Time Spy; using the XC3 Ultra BIOS the card consumes 100W(!) of additional power with the same [email protected],875V configured in Afterburner, and the clocks are slightly lower. Is there any logical reason for it? I went directly back to my Ventus BIOS.


----------



## xermalk

Rheinfels said:


> Currently I'm running [email protected]@about 250 Watt.


In what benchmark, and what was the overclock you got from running the OC scanner?
My 3080 pulls 320W at 0.875V and 1950 core clock, and crashes within 30s of a Port Royal run.


----------



## Rheinfels

xermalk said:


> In what benchmark , and what was the overclock you got from running the OC scanner?
> My 3080 pulls 320w at 0.875v and 1950 core clock, and crashes within 30s in a port royal run.


250W is not correct, more likely 280W, and I never ran the OC Scanner. I gamed some hours of Division 2 and Anno 1800, and Time Spy scores about 18257, Port Royal 11707 and Time Spy Extreme 9122. Therefore it seems stable.
Port Royal: https://www.3dmark.com/3dm/53417267?
TimeSpy: https://www.3dmark.com/3dm/53417077?
TimeSpyExtreme: https://www.3dmark.com/3dm/53417592?


----------



## Rbk_3

VPII said:


> Hi there, for stock the boost is maybe 15 to 30mhz lower than what I get, but your temps is pulling it down. I usually run with temps not reaching 60c during the run as I set the fan to 100% when running benchmarks.





Stash said:


> Nothing special, just wanted to test that the PL & fans were working on the new BIOS. Judging by the temps from that run I'm guessing I was probably close to stock on all settings.
> 
> https://www.3dmark.com/pr/527230 is probably more accurate to what a flashed Suprim->Trio is capable of, tad underwhelming though! How's your cooling situation? Your 67°C average at 0.975v seems to be higher than I would expect, probably throttling your core speed a lot tbh.
> 
> Has any non-STRIX 3x8pin hit 13k PR on air?


Thanks. What was the new BIOS you were using for that test? The Suprim BIOS will probably be better than anything else for our card.

I was able to get almost 12000 in Port Royal. That seems to be the best I can get without messing around with a custom voltage curve, on the stock BIOS. It crashed in Time Spy however. +165/+550. Anything over +75 crashed in Warzone.



https://www.3dmark.com/pr/527229




I am going to mess around with the TUF tonight. If it is significantly better I will see if I can deal with the coil whine.



Edit


Looks like the Suprim bios is already available. Has anyone tried it?









MSI RTX 3080 VBIOS: 10 GB GDDR6X, 1440 MHz GPU, 1188 MHz Memory (www.techpowerup.com)


----------



## Panat

What is the best BIOS for the *TUF-RTX3080-O10G-GAMING*? Thanks


----------



## xermalk

Panat said:


> What is the best BIOS for the *TUF-RTX3080-O10G-GAMING*? Thanks


The original *TUF-RTX3080-O10G-GAMING* BIOS.


----------



## Panat

xermalk said:


> The original *TUF-RTX3080-O10G-GAMING* BIOS.


 okay thanks.


----------



## Alemancio

Stash said:


> 2.1 @ 1012mv @ 61 degrees


Is it stable? I can bench the same, even pass the stress test at 99.7%, but BFV and WZ crash unless I set 1.037v.


----------



## techenth

Colonel_Klinck said:


> Cheers dude, job for the weekend.


Hey man, how did your weekend project go? Any before/afters?

Best I could do on my TUF OC without the mod: https://www.3dmark.com/3dm/53445247


----------



## cstkl1

[email protected]|50
[email protected]
strix [email protected]+165/+1350

https://www.3dmark.com/pr/530123


----------



## Colonel_Klinck

techenth said:


> Hey man, how did your weekend project go? Any before/afters?
> 
> Best I could do on my TUF OC without the mod: https://www.3dmark.com/3dm/53445247


I haven't done it yet. Unfortunately, I think I've caught Covid. Got tested today; I woke up yesterday feeling like crap, with a fever and cough. If I'm feeling any better tomorrow I'll have a look at it. I'm certainly not going to be working until at least Wednesday, as I get the test results on Tuesday. I'll be amazed if it's negative; it's either flu or Covid.


----------



## Stash

Rbk_3 said:


> Thanks. What was the new bios that was using for that test? Suprim Bios will probably be better than anything else for our card.
> 
> I was able to almost get 12000 in Port. Seems to be the best I can get with out messing around with custom voltage curve and on stock bios. That crashed in Timespy however. +165 +550. Anything over +75 crashed in Warzone.
> 
> 
> 
> https://www.3dmark.com/pr/527229
> 
> 
> 
> 
> I am going to mess around with the TUF tonight. If it is significantly better I will see if I can deal with the coil whine.
> 
> 
> 
> Edit
> 
> 
> Looks like the Suprim bios is already available. Has anyone tried it?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> MSI RTX 3080 VBIOS: 10 GB GDDR6X, 1440 MHz GPU, 1188 MHz Memory (www.techpowerup.com)


Yeah, on the Suprim BIOS atm. It's okay, but I'm not sure it's worth it; I don't think the extra TDP is useful unless you're going to put it under water. I'm jealous of the TUF tbh mate, if I could've picked any launch AIB it'd have been a TUF; it completely eclipses the Trio imo. Probably won't flash back to stock though, nice to have the headroom if needed, and the other stuff (fans, etc.) is 1:1 with the Trio as far as I can tell.



Alemancio said:


> Is it stable? I can bench the same, even pass stress test with 99.7% but BFV and WZ crash unless I set 1.037mv


Seems similar to my results. I don't really play games, but I gave Quake RTX a spin without issues. From my testing [email protected] is definitely on the edge though; I had a few benches fail intermittently when going >15MHz over... a shame, because I don't think the thermals at 1.037v (avg. 61 vs 65-ish for me) are worth it.

(On air) undervolting is definitely the way on these cards; I'm guessing you could comfortably hit 2GHz on most cards at 925mv (or less)... the extra ~0.1v and the subsequent heat for 100MHz seems like a bad trade. Wish there was some good data like PR but for undervolting, e.g. score adjusted for voltage or similar.
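For what it's worth, that "score adjusted for voltage" idea is easy to mock up yourself once you've logged a few runs. A minimal sketch; the settings, scores, and power figures below are made-up placeholders, not measured results:

```python
# Rank undervolt settings by benchmark points per watt.
# All numbers below are illustrative placeholders, not real results.

def points_per_watt(score: float, avg_power_w: float) -> float:
    """Benchmark score divided by average board power during the run."""
    return score / avg_power_w

runs = [
    {"setting": "2100MHz @ 1.037v", "score": 12100, "power": 370.0},
    {"setting": "2010MHz @ 0.925v", "score": 11700, "power": 300.0},
]

# Most efficient profile first:
for r in sorted(runs, key=lambda r: points_per_watt(r["score"], r["power"]),
                reverse=True):
    print(f'{r["setting"]}: {points_per_watt(r["score"], r["power"]):.2f} pts/W')
```

Swap in real Port Royal graphics scores and the average board power from a GPU-Z log, and the efficiency gap between an undervolt and a max-OC profile shows up immediately.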


----------



## cstkl1

Colonel_Klinck said:


> I haven't done it yet. Unfortunately I think I've caught Covid. Got tested today, woke up feeling like crap yesterday, fever and cough. If I'm feeling any better tomorrow I'll have a look at it, certainly not going to be working until at least Wednesday as I get the test results on Tuesday, I'll be amazed if its negative, its flu or covid.


Breathing difficulty and fever are the main indicators.


SoldierRBT said:


> Thanks. Not much, around 1-2C less. It let me add +15MHz on the core in the Port Royal run. I didn't see any improvement in memory OC with the 120mm fan on the backplate; after +1200, performance decreases. I did a video last month running Quake 2 RTX at 1.012v locked @ 2190MHz. Got 2175-2160MHz avg. after it gets around 60C. I'd recommend testing your OC with this game (free on Steam).


Interesting, that Quake RTX. On the Strix BIOS, whatever I do for Quake RTX I'm still stuck at 1995-2010.
After an alt-tab it goes back up, but it will drop to 0.9xx v...


----------



## Colonel_Klinck

Morning all. Any advice on why this score is so low? I appear to have a higher average core frequency than others with higher scores, so I'm a tad confused by that. I'm still bouncing off the power limit on occasion. TUF OC
2130MHz @ 0.987v
+1200 mem



https://www.3dmark.com/3dm/53465162?


----------



## VPII

Colonel_Klinck said:


> Morning all. Any advice on why this score is so low? I appear to have a higher average core frequency than others with higher scores, so I'm a tad confused by that. I'm still bouncing off the power limit on occasion. TUF OC
> 2130MHz @ 0.987v
> +1200 mem
> 
> 
> 
> https://www.3dmark.com/3dm/53465162?


Drop your memory clock by 250MHz and see if it improves.


----------



## Erik9519

Well, I've discovered why some cards won't consistently boost to the advertised TGP: there seems to be a separate power limit on the GPU Chip Power Draw metric. I tested it by logging all the power consumption metrics in GPU-Z during Port Royal and then comparing them with a FurMark run. The FurMark run does use all the power up to the rated TGP; however, you will notice that the GPU Chip Power Draw metric is much lower than in Port Royal.
My specific card seems to have a cap on GPU Chip Power Draw at around 180W.

Can anybody else test this? And also report on the GPU Chip Power Draw cap seen in Port Royal?
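If anyone wants to compare runs the same way, GPU-Z's sensor log is just comma-separated text, so the per-column peaks are easy to pull out. A rough sketch only; the column labels and the inline sample values are assumptions standing in for real log files:

```python
import csv
import io

def peak(log_text: str, column: str) -> float:
    """Max value of one sensor column in a GPU-Z style CSV log."""
    reader = csv.DictReader(io.StringIO(log_text), skipinitialspace=True)
    return max(float(row[column]) for row in reader)

# Inline stand-ins for real GPU-Z logs captured during each workload:
port_royal = """Board Power Draw [W], GPU Chip Power Draw [W]
318.2, 179.5
320.1, 180.0
"""
furmark = """Board Power Draw [W], GPU Chip Power Draw [W]
339.8, 141.2
340.3, 139.9
"""

for name, log in (("Port Royal", port_royal), ("FurMark", furmark)):
    print(name,
          "board peak:", peak(log, "Board Power Draw [W]"),
          "chip peak:", peak(log, "GPU Chip Power Draw [W]"))
```

Read the real logs with `open(...).read()` instead of the inline strings; if the chip-power peak flatlines near the same value across workloads while board power varies, that points at a chip-level cap.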


----------



## SoldierRBT

cstkl1 said:


> breathing difficulty and fever the main indicator.
> interesting..that quake rtx
> in strix bios.. whatever i do for quake rtx.. i will still be stuck at 1995-2010..
> alt tab it goes back up but it will drop to 0.9xx v...


What's your power limit? Let it run for a few minutes, then check the lowest voltage under load. Then lock that voltage in MSI Afterburner and increase the clocks. With the 450W BIOS and no memory OC, 1.012v is the max it can do without hitting the PL. Constant 440-450W load.

Time Spy Extreme is by far the most intense test; voltage drops to like 0.9XX under load. These cards need a 500W BIOS.


----------



## cstkl1

SoldierRBT said:


> What's your power limit? Let it run for a few minutes then check what was the lowest voltage under load. Then lock that voltage in MSI Afterburner and increase clocks. With 450W BIOS and no memory OC 1.012v is the max it can do without hitting PL. Constant 440-450W load.


The Strix 3080 can't do 440-450, even though the BIOS looks like it can. It's normally 400-420 with spikes to 440; I had this on air as well.

Only my 3090 Strix was a full 47x watts.


----------



## Falkentyne

Erik9519 said:


> Well, I've discovered why some cards won't consistently boost to the advertised TGP. There seems to be a separate power limit set on the GPU Chip Power Draw metric. I was able to test it by checking in GPU-Z all the power consumption metrics in Port Royal and then comparing them with a FurMark run. The FurMark run does use all the power up to the rated TGP however you will notice how the GPU Chip Power Draw metric is much lower than in Port Royal.
> My specific card seems to have a cap on the GPU Chip Power Draw set at around 180W.
> 
> Can anybody else test this? And also report on the GPU Chip Power Draw cap seen in Port Royal?


What is your TGP?
This seems wrong. The GPU Chip Power Draw limit is 300 watts on a 3090 set to a 400W TDP; there's no way it's only 180W on a 3080. It's usually around 75% of the board power draw limit.
FurMark intentionally throttles the card and runs at base clocks; it's been throttled like this for over 10 years.
You are correct that there are different power limits. TGP is board power draw; the PCIe slot and Chip Power Draw have their own limits. I am unsure whether the individual 8-pin connectors have individual limits, where PWR is triggered if one gets too high by itself, or whether another 8-pin ramps up without signaling a limit (the board power limit equals the 8-pins plus the PCIe slot).
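To make the relationship concrete, here is a toy model of those separate caps, with board power as slot plus 8-pins. Every limit value is an illustrative guess, not something read out of a real BIOS:

```python
# Toy model of the separate power limits described above: each rail and the
# chip has its own cap, and board power is the sum of slot + 8-pin inputs.
# All limit values are illustrative guesses, not real BIOS numbers.
LIMITS = {"pcie_slot": 75.0, "8pin_each": 150.0, "chip": 240.0, "board": 320.0}

def tripped_limits(slot_w, pins_w, chip_w):
    """Return which limits a given load would trip."""
    hits = []
    if slot_w > LIMITS["pcie_slot"]:
        hits.append("pcie_slot")
    hits += [f"8pin_{i}" for i, p in enumerate(pins_w) if p > LIMITS["8pin_each"]]
    if chip_w > LIMITS["chip"]:
        hits.append("chip")
    if slot_w + sum(pins_w) > LIMITS["board"]:
        hits.append("board")
    return hits

# Every individual rail is within spec, yet the total still trips the board cap:
print(tripped_limits(70.0, [140.0, 145.0], 200.0))  # -> ['board']
```

The point of the example: a card can hit the board limit while no single connector, nor the chip, is anywhere near its own cap, which is why per-rail readings alone don't tell you which limit is throttling.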


----------



## Erik9519

Falkentyne said:


> What is your TGP?
> This seems wrong. GPU Chip power draw's power limit is 300 watts on 3090 that is set to 400W TDP. There's no way at all it's only 180W on a 3080. It's usually around 75% of the board power draw limit.
> Furmark intentionally throttles the card and reports throttle at base clocks. It's been throttled like this for over 10 years.
> You are correct that there are different power limits. TGP is board power draw. PCIE slot and Chip Power Draw have their own limits. I am unsure if the individual 8 pin connectors have individual limits where pwr is triggered if one gets too high by itself, or if another 8 pin ramps up without signaling a limit (Board power Limit is equal to 8 pins + PCIE slot).


Well, my card has a TGP of 340W; however, in Port Royal the GPU Chip Power Draw never crosses 180W with a total TGP of ~320W. In FurMark, however, the card boosts to 1980MHz (stock) with a TGP of ~340W and a GPU Chip Power Draw of ~140W. The card is an Inno3D RTX 3080 iChill Frostbite.
I've attached the logs to this post so that you can see too.


----------



## Falkentyne

Erik9519 said:


> Well, my card has a TGP of 340W; however, in Port Royal the GPU Chip Power Draw never crosses 180W with a total TGP of ~320W. In FurMark, however, the card boosts to 1980MHz (stock) with a TGP of ~340W and a GPU Chip Power Draw of ~140W. The card is an Inno3D RTX 3080 iChill Frostbite.
> I've attached the logs to this post so that you can see too.


Chip power draw in PR is always going to be lower.
Run Time Spy. You will see it higher in TS than PR.


----------



## Erik9519

Falkentyne said:


> Chip power draw in PR is always going to be lower.
> Run Time Spy. You will see it higher in TS than PR.


I've just done a run in Time Spy; it behaves very similarly to Port Royal. It just doesn't want to go beyond ~320W TGP. I believe the cap is on GPU Chip Power Draw at ~180W, because both MVDDC and PWR_SRC are higher in FurMark. I've contacted Inno3D about it, and they basically told me this happens because I'm hitting a power limit somewhere; however, they weren't willing to update the BIOS to sort this out...


----------



## cstkl1

Erik9519 said:


> I've just done a run in Time Spy, it behaves very similarly to Port Royal. It just doesn't wanna go beyond ~320W TGP. I believe the cap to be GPU Chip Power Draw at ~180W because both MVDDC and PWR_SRC are higher in FurMark. I've contacted Inno3D about it and they basically told me this happens because you're hitting a power limit somewhere however, they weren't willing to update the BIOS to sort this out...


There's some wattage reserved for fans, LEDs, and VRAM.

And from the Kingpin card, I can guess there's a reference loadline used as a guardband.

Just making assumptions here, but that's my theory. Also, the on-die TGP calculation has a thermal consideration in the formula.


----------



## Erik9519

cstkl1 said:


> theres some wattage reserved for fans, led, vram
> 
> and from kingpin card. i can guess theres a reference loadline used as guardband.
> 
> just making assumptions here. but thats my theory. also tgp calculation on die has a thermal consideration into the formula.


Perhaps, but I've seen some people here try cross-flashing a BIOS to increase the power limit, and they also found that the TGP won't go past what their original BIOS did. There must be some kind of limitation outside the BIOS somewhere, and I suspect it to be the GPU Chip Power Draw. If somebody else with such a card could perform these tests, we could perhaps pinpoint it.
Regardless, that would be very unfortunate, as it makes having a water block on the card pretty pointless imo.


----------



## c0nsistent

Has anyone tried flashing another bios onto an EVGA XC3 to actually get more than 340W TDP?


----------



## c0nsistent

The only way to keep my XC3 from power limiting in Unigine Superposition 4K Optimized is to limit the voltage to 0.875v, and it STILL power limits down to 0.862v at times during the bench, at 100% fan speed with temps never going over 60C. If I step up to a 3090, I don't have to ship my card until the card I want is in stock, right? I pay the difference, and do they ship the card first, or do I ship mine first? *Never mind, I read the FAQ*


----------



## munternet

Finally got my 3080 TUF OC yesterday, which I ordered the day after release 
I haven't plumbed it in yet, and I'm wondering if I even need to? It seems to work fine on air.
I'll have to run a test and see how it performs. Is the standard Time Spy ok?
Edit:
Ran a timespy without any gpu adjustments
Is this about normal?
Cheers


----------



## Rbk_3

Stash said:


> Yeah on the Suprim BIOS atm. It's okay, not sure it's worth it though, I don't think the extra TDP is useful unless you're going to put it under water. I'm jealous of the TUF tbh mate, if I could've picked any launch AIB it'd have been a TUF it completely eclipses the Trio imo. Probably won't flash back to stock though, nice to have the headroom if needed and the other stuff (fans, etc.) is 1:1 with the Trio as far as I can tell.


I kept the TUF. The fans make an annoying pitched noise and it has more coil whine at load, but it was a better chip, and I wear headphones when I game. I got over 12200 in Port Royal but couldn't get over 12000 on the Trio no matter what I did. The TUF was just with a simple OC, and to get near 12000 on the Trio I needed to do a custom curve. If I mess around with that on the TUF I can definitely improve my score.

Here is a comparison of the two: a Trio custom curve of, I believe, 1.075V/2025MHz +550 memory vs +165/+800 on the TUF.



https://www.3dmark.com/compare/pr/531701/pr/531367








munternet said:


> Finally got my 3080 Tuf OC yesterday that I ordered the day after release
> I haven't plumbed it in yet and I'm wondering if I even need to? It seems to work fine on air.
> I'll have to run a test and see how it performs. Is the standard timespy ok?
> Edit:
> Ran a timespy without any gpu adjustments
> Is this about normal?
> Cheers


Looks about right. Here is mine stock with no settings changed. 


https://www.3dmark.com/spy/15443598


----------



## smoke2

Does anyone have experience with both the ASUS RTX 3080 TUF OC and the Strix OC?
What's the noise under load with the stock OC in the rig? Which one is quieter, if either? I can't find a comparison anywhere.


----------



## ViTosS

Finally was able to buy an RTX 3080 in Brazil after many days of trying. Got the FTW3 Ultra; waiting for the delivery now.


----------



## sakete

Well, I traded up, from my XC3 Ultra to an FTW3 Ultra. Man, this FTW3 is HUGE! Once I get my Optimus waterblock, time for some serious OC'ing.


----------



## Stash

Rbk_3 said:


> I kept the TUF. The fans make an annoying pitch noise and it has more coil while at load but it was a better chip and I wear headphones when I game. I got over 12200 on Port Royal but couldn't get over 12000 on the Trio no matter what I did. The TUF was just with a simple OC and to get near 12000 on the Trio I needed to do a custom curve. If I mess around with that on the TUF I can definitely improve my score.


Nice work on the TUF fella, I reckon with the right mods/cooling you could hit close to 13k with it (with a bit of luck).

850mv @ 2GHz

I refuse to believe this would be all that stable outside of synthetic tests, though. Need recommendations on what's good to test it with.


----------



## TrixAre4FatKids

Anyone try flashing a different BIOS on a Ventus 3X yet? Preferably the Strix. I just got mine in today; I can hit 2140/10,200 on the stock BIOS, but I'd love to see more. I just don't want to brick something and risk no stock for 9 months.


----------



## Micko

I don't know whether I should write this post here or in the PSU forum, but here it goes...

I bought the Asus 3080 TUF last week, undervolted it to 0.95v/1965MHz/+500 mem, and everything was working as it should until last night, when the PC shut down during the Heaven benchmark. No BSOD, just a hard power-off and restart. About 30 minutes later the PC shut off again while playing a game. At first I thought I might have an unstable overclock, so I changed it to 0.9v/1830MHz/+0 mem and left it there. But this morning the PC shut off again while browsing, and it then shut down during Windows boot twice in a row. Again, 30 minutes later it shut down when nothing was going on.

I googled a bit and everything points to a possible PSU failure. I have a 9-year-old Silverstone Strider 750W. The rest of my spec is an Intel 6700K @ 4.6GHz, an Asus Z170 Pro Gaming, and 2x8GB Kingston 3000 RAM. What do you guys think? And what would be the best way to pinpoint which part of the PC is failing? I don't have a spare PSU or friends with spare parts I could borrow.


----------



## BluemoonRisen

Did anyone try flashing the new ZOTAC 3080 AMP Holo BIOS to the Trinity 3080 ?


----------



## TK421

Has anyone here who has OC'd both a 3080 and a 3090 found that the silicon quality is better on the 3090? Curious.

Both generally hit the same power limit and frequency, but the 3090 does it with more cores active on the silicon and lower power/voltage required to sustain it (lower leakage).

Curious if it's just my impression or if someone has actually experienced it.






BluemoonRisen said:


> Did anyone try flashing the new ZOTAC 3080 AMP Holo BIOS to the Trinity 3080 ?


is it the same pcb though? or is it just a different cooler with price markup?


----------



## BluemoonRisen

TK421 said:


> is it the same pcb though? or is it just a different cooler with price markup?


I don't know exactly. Zotac said it's at least the same layout.

I also reached out about future BIOS releases for the Trinity, and they said nothing is planned.

I guess I have to shunt my Trinity then, because there's no other way to increase the power limit.


----------



## TK421

BluemoonRisen said:


> I don´t know exactly. Zotac said at least it´s the same layout.
> 
> I also reached out on future BIOS releases for the Trinity and they said nothing is planned.
> 
> I guess i have to shunt my Trinity then, because no other way in increasing power limit


Using hot glue on the shunts seems to be the best idea:


----------



## Deviem

Stash said:


> 😎
> 
> I'm about to give it a spin. Only negative is not being able to re-set EEPROM protection after flashing but I figure that's the norm with the Ampere atm.


Are you able to control rgb on TRIO with SUPRIM bios ?


----------



## Sparkster

Deviem said:


> Are you able to control rgb on TRIO with SUPRIM bios ?


Yes you are; however, I could only use the effects that were available on the original BIOS. There appear to be new effects for the Suprim, but these just do the rainbow puke on mine. Other than that it works fine.


----------



## Dreamliner

I heard someone reference a controversy with the EVGA 3080 FTW3, but I'm not sure what he was talking about. Something about PCIe power load balancing? Does anyone know?

I also think I read something about the XC3 models being able to have the BIOS flashed as well and the entire 3080 EVGA stack is effectively identical, is this right?

I've still been unable to order a 3080 from anyone but I don't know what to get now. I was thinking of the FTW3 ULTRA, but know I'm leaning towards the TUF or maybe STRIX, though I have a TUF board so matching...


----------



## eliwankenobi

Dreamliner said:


> I heard someone reference a controversy with the EVGA 3080 FTW3, but I'm not sure what he was talking about. Something about PCIe power load balancing? Does anyone know?
> 
> I also think I read something about the XC3 models being able to have the BIOS flashed as well and the entire 3080 EVGA stack is effectively identical, is this right?
> 
> I've still been unable to order a 3080 from anyone but I don't know what to get now. I was thinking of the FTW3 ULTRA, but know I'm leaning towards the TUF or maybe STRIX, though I have a TUF board so matching...


Yes, you’ll find what you are looking for by looking at the videos from Frame Chasers on YouTube. 

He talks about EVGA doing load balancing across all power sources on the board, including the PCIe slot power (75W). Because of that, he even shunt-modded those PCIe resistors to effectively bypass the power limits and get the thing to pull about 600 watts. The dude is purely performance driven. Also a bit of a mad scientist.

EDIT: Forgot to add that he also flashed the FTW3 XOC BIOS onto the XC3 and it worked. He also said that BIOS is compatible with the MSI Gaming X Trio, and that with the X Trio you don't need to mod the PCIe shunt resistors, only the 8-pin ones, and you're good to go.

His Ampere videos are here: Ampere Launch Review


----------



## eliwankenobi

Also wanted to introduce myself here as a new, soon-to-be owner of the EVGA 3080 XC3 Ultra.

Any sure-shot recommendations to get a good OC out of this thing? Don't wanna end up having to do what the Frame Chasers guy I referenced above did.


----------



## Erik9519

eliwankenobi said:


> Also wanted to introduce myself here as new soon to be owner of the EVGA 3080 XC3 Ultra.
> 
> Any sure shot recommendations to get a good OC out of this thing? Don’t wanna end up having to do what the Frame Chasers guy I referenced above did.


If you could let us know what GPU-Z reports for board power usage during FurMark and Port Royal, with the power slider at default and maxed, that would be great. My card, for instance, an Inno3D 3080 iChill Frostbite rated for 340W, won't use more than 320W in anything but FurMark. Changing BIOSes, as reported by others for certain cards, also seems to do absolutely nothing, which is very disappointing...


----------



## eliwankenobi

Erik9519 said:


> If you could let us know what GPU-Z reports for Board power usage during Furmark and Port Royal with the power slider at default and maxed that would be great. My card for instance being a Inno3D 3080 iChill Frostbite rated for 340W won't use more than 320W in anything but Furmark. Changing bioses as reported by others for certain cards also seems to do absolutely nothing which is very disappointing...


Will do and post here. I don't believe it will exceed the 320W limit; I think that is the XC3's power limit.

What clocks are you getting sustained?


----------



## MRLslidchen

Finally got a water block for my 3080 TUF OC. I used the TUF thermal pads instead of EK's; they are better and killed the coil whine. Amazing how stable the clocks can get once watercooled, especially in RTX benchmarks.











https://www.3dmark.com/pr/541752


----------



## Erik9519

eliwankenobi said:


> Will do and post here. I don’t believe it will exceed the 320w limit. I think that is XC3’s power limit.
> 
> What clocks are you getting sustained?


When playing COD Cold War I get about 2010-2040MHz sustained with +60 core (base boost is 1770MHz) and +700 memory at about ~40C. In stuff like Time Spy, though, it'll go as low as ~1850MHz because the power limit is just way too low.



MRLslidchen said:


> Finally got a Waterblock for my 3080 TUF OC. I used the TUF Thermal Pads instead of EK’s. They are better and killed the coil whine. Amazing how stable the clocks can get once watercooled. Especially in RTX Benchmarks.
> 
> 
> 
> 
> 
> https://www.3dmark.com/pr/541752


Is that mad result gaming-stable too? I've noticed I can crank the frequency to +120 in Port Royal, which still gets me clocks lower than yours, but then it crashes within seconds in COD Cold War.


----------



## MRLslidchen

That score was made with a +195 offset; no way that is stable. But I haven't tested whether +210 is doable. So far, even on air, I can play my games with +150. Will test Cold War once I get my redeem code.

I also noticed something with my TUF: my memory error correction doesn't seem to work at all. Once I go past a 900 offset on memory, all benches/games crash instead of giving lower framerates/performance. I mean, I like it, since it tells me precisely where my limit is, but it's strange behaviour anyway.


----------



## Omar Rana

I bought an EVGA RTX 3080 FTW3 Ultra.
Unfortunately, I cannot overclock the GPU clock offset to more than 114MHz. I tried the OC BIOS with 450 watts, but it still crashes if I move the slider past 114MHz. My power supply is 850 watts.
My specs are:
Ryzen 3600
16GB DDR4
ASRock Fatal1ty B450 Gaming-ITX/ac motherboard

I still have 7 more days to return the product.

I previously bought an MSI 3080 Ventus 3X, which could overclock to 104MHz and was way quieter.
I spent 100 euros more on this graphics card, but it seems like it wasn't worth it.

Any idea what I can try? I plan to water-cool the GPU after 1 year.


----------



## Erik9519

Omar Rana said:


> I bought an evga rtx 3080 ftw3 ultra.
> Unfortunately , i cannot overclock the Gpu clock offset to more 114mhz. i tried the oc bios with 450 watt , but it still crashes if i move the slider more then 114 mhz. my power supply is 850 watt.
> My specs are_
> ryzen 3600
> 16 gb ddr4
> ASRock Fatal1ty B450 Gaming-ITX/ac mainboard.
> 
> I still have 7 more days to return the product.
> 
> I previously had bought msi 3080 ventus 3x , which could overclock to 104 mhz and was way quiter.
> i spent 100 euro more on this graphic cards but seems like nothing worth it.
> 
> Any idea what i can try? i plan to water cool the gpu after 1 year.


It's a silicon lottery game, mate. If you want a higher guaranteed baseline, you're gonna have to go with a Strix OC, where the base boost clock is 1935MHz in the OC profile. That's slightly higher than your 1800 +114MHz OC, but whether it's worth paying the premium I think is debatable, especially given how hard it is to get hold of one. If you really want the best at all costs, sure, go for it, but again, don't expect a crazy OC; it's a lottery.
Personally, I'd just keep it and go ahead with the water cooling.

At least you're not stuck with a 320W power limit like I am 🤣


----------



## cstkl1

ACV, ACV, ACV.

It's nuts. Yes, a girl whose Animus takes her to a Viking, who dreamscapes to a past life in ASGARD!!


----------



## ragnarok666

Hi folks,

I recently got a Galax RTX 3080 SG. I got this one since it sold for almost $200 below the MSRP of the FE here in Japan (Nvidia is asking a greedy 110,000 yen =~ 1000 USD) and it also got decent reviews. Now I find myself stuck with a 100%/320W power target and stupid fan behaviour: the fan jumps from 0 straight to 50%, which also happens occasionally while just watching a video. In Afterburner I can only set the fan as low as 30%, with no zero-fan mode available.

So I would like to flash a different BIOS with a slightly higher power target (to get to a stable 2GHz average; currently at a 1950 average in Port Royal) and better fan control. Did anyone already flash this card? Which BIOS did you use?
Or would it be safe to flash any BIOS from another card with a reference PCB and 2x8-pin? Any concerns about flashing a BIOS from a card with more power stages than the 16 mine has?

Thanks for the advice.


----------



## mattxx88

I got my TUF yesterday and did some tests; dunno why my card is capped @ 340W.

No chance to get it higher. I just checked the BIOS and it's in performance mode.
Has anyone else had this issue?


----------



## TK421

mattxx88 said:


> i got my tuf yesterday and did some tests, dunno why mine card is capped @340w
> 
> 
> 
> no chance to get it higer, i just checked the bios and is in performance mode
> someone also got this issue?


even with reinstall drivers etc doesn't help?


----------



## mattxx88

TK421 said:


> even with reinstall drivers etc doesn't help?


Yep, before uninstalling the previous GPU I did a clean driver uninstall with DDU, as I usually do when installing new hardware.

The only thing left for me to try is a full driver installation, because the drivers I have now were slimmed with NVCleanstall.


----------



## TK421

mattxx88 said:


> yep, before uninstalling the previous gpu i did a clean driver uninstall with ddu as i use to do when installing new hardware
> 
> the only thing remains me to try is a full driver installation, cause the ones i have now was slimmed with NVcleanstall


that could be a fix, try it


----------



## Omar Rana

Erik9519 said:


> It's a silicon lottery game mate. If you want higher guaranteed base you're gonna have to go with a Strix OC where the base boost clock is 1935MHz in the OC profile. Slightly higher than your 1800 +114MHz oc but whether it's worth to pay the pay premium or not I think that's debatable especially given how hard it is to get a hold of one. If you really want the best at all costs sure go for it but again don't expect crazy OC, it's a lottery game.
> Personally I'd just keep it and go ahead with the water cooling.
> 
> At least you're not stuck with a 320W power limit like I am 🤣


I had an MSI Ventus 3X 3080 which was stuck at 320W, and it honestly produced less heat and noise than this EVGA. It was 78 euros cheaper, but I returned it within the 14-day window and got the EVGA RTX 3080 FTW3 Ultra. Now I am regretting it: even though I can move the slider to 450 watts, it never allows me to go past +114MHz. I have seen other people with the same card going past 150 easily. I don't know, it could be my power supply or my motherboard, which is ITX.
Anyway, I feel like returning it, waiting for new stock, and getting the MSI again.


----------



## VPII

I was pretty happy trying out the MSI Suprim BIOS with my MSI Gaming X Trio. Temps were a little hotter, as in load temps 5 to 6C hotter, but performance was incredible, to say the least. I cannot clock the card as high with the Suprim BIOS, but at least it keeps the clocks for longer.

Time Spy average clock was 55MHz lower than the max clock; however, the max clock here would have been only 2130, judging from the clocks seen during the Time Spy run.


https://www.3dmark.com/spy/15556705



Time Spy Extreme average clock was 40MHz lower than the max clock.


https://www.3dmark.com/spy/15558234



Still, it is pretty nice seeing this. I am back on the Gaming X Trio BIOS now; I prefer to keep the original BIOS unless I'm benchmarking the card.


----------



## ausmisc

BluemoonRisen said:


> Did anyone try flashing the new ZOTAC 3080 AMP Holo BIOS to the Trinity 3080 ?


Yeah, flashed it to my non-OC Trinity with no issues. I've only tested the first 2 DisplayPorts, as that's all I use. Seeing power draw up to a max of 370W in benching. Don't know if the RGB still works, and couldn't care less. Both graphics scores are in the top 3% for a 3080; seems decent for a "cheap" 2x8-pin.



https://www.3dmark.com/spy/15590271




https://www.3dmark.com/pr/531935


----------



## Stash

Deviem said:


> Are you able to control rgb on TRIO with SUPRIM bios ?


Not sure mate, I don't mess with the RGB stuff. At the very least it isn't _broken_, which is the most I can confirm.


----------



## mattxx88

TK421 said:


> that could be a fix, try it


Nope, still stuck.
I think I'll try a BIOS flash.


----------



## TK421

mattxx88 said:


> no way, still stuck
> i think ill try a bios flash


be sure to save your current bios before flashing
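Seconding this; a tiny wrapper makes the backup hard to forget. A sketch only: the nvflash flags referenced in the strings below (`--save` to dump the current ROM, `-6` to flash past the subsystem-ID check) are the commonly cited ones, so verify them against your nvflash build's help output before relying on them:

```python
# Refuse to suggest flashing until a backup ROM already exists on disk.
from pathlib import Path

def flash_guard(backup: str, newbios: str) -> str:
    """Return the next nvflash step, insisting on a backup first."""
    if not Path(backup).is_file():
        raise FileNotFoundError(
            f"No backup ROM at '{backup}'. First run: nvflash --save {backup}")
    return f"Backup present. Now safe to run: nvflash -6 {newbios}"

# Hypothetical usage:
#   print(flash_guard("tuf_stock.rom", "suprim.rom"))
```

The design point is just that the flash command never gets printed until the backup file physically exists, so a brick can always be recovered by flashing the saved ROM back.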


----------



## mattxx88

TK421 said:


> be sure to save your current bios before flashing


Ah sure, I just checked the TechPowerUp BIOS page and I see my version is uploaded.
I'll try an older one (mine is the latest).
Thanks for the support.


----------



## iluvkfc

Hey all,

Which 3080 card would you keep between the *MSI Ventus 3X* and the *EVGA XC3 Ultra*? Is one known to be a better bin than the other? What about VRM quality?
I'm planning to watercool, then shunt mod after I run it for a couple of months to make sure it's not defective.
By all accounts the XC3 looks like the better card, but I'm concerned about waterblock availability; the Ventus seems to have more options, including one from Bykski, which is significantly less expensive.


----------



## lester007

Omar Rana said:


> I had an MSI Ventus 3080 3X which was stuck at 320 W, and it honestly produced less heat and noise than this EVGA. It was 78 euros cheaper, but I returned it within the 14-day window and got the EVGA RTX 3080 FTW3 Ultra. Now I am regretting it; even though I can move the slider to 450 watts, it never lets me go past a +114 MHz offset. I have seen other people with the same card going past +150 easily. I don't know, it could be my power supply or my motherboard, which is an ITX board. Anyway,
> I feel like returning it, waiting for new stock, and getting the MSI again.


I have the EVGA 3080 FTW3 Ultra too. I run an offset of +90 in games, but for benchmarks and such I can go beyond that (+150 or so) without crashing.
I'm actually fine with that; I just need a waterblock from EK and then I'm set 😁


----------



## Battler624

iluvkfc said:


> Hey all,
> 
> Which 3080 card would you keep between the *MSI Ventus 3X *and the *EVGA XC3 Ultra*? Is one known to be a better bin than the other? What about VRM quality?
> I'm planning to do watercooling, then shunt mod after I run it for a couple months to make sure it's not defective.
> By all accounts, the XC3 looks like the better card, but I'm concerned about waterblock availability, it seems like the Ventus has more options, including one from Bykski, which is significantly less expensive.


The Ultra without question, simply because the Ventus is limited to 320 W by design, and that can't be changed unless you mod the card itself.


----------



## MRLslidchen

Shunt modded TUF OC with an EK waterblock. Power draw is around 420 W. Still waiting for my 15 mΩ shunt to arrive to mod the PCIe slot shunt. And I need to find a solution to use this card without a riser cable; it's just too tall...



http://www.3dmark.com/spy/15597378





https://www.3dmark.com/pr/545164



Oh, and using the Asus thermal pads from the TUF instead of EK's reduced my coil whine a lot. I guess you could also use the Alphacool Eisschicht thermal pad as a replacement.


----------



## PraiseKek

I have an eta for my gaming X trio of 30/11 finally after ordering in the first hour of launch night, hooray


----------



## mattxx88

Seems I'm not the only one with TUF power limit issues (capped at 340-350 W); it is also reported by many users on hardwareluxx.









Ein fast perfekter Allrounder: ASUS TUF Gaming GeForce RTX 3080 OC im Test

> That would possibly support the 30 W thesis and would be an absolutely unnecessary restriction, especially since I almost never see my card under 30 W at idle (even when nothing is open). I wonder anyway why this thing has to draw so much more at idle than my 2080 Ti did before.

www.hardwareluxx.de


----------



## mattxx88

MRLslidchen said:


> Shunt modded TUF OC with EK waterblock. Power Draw around 420W. Still waiting for my 15 mohm shunt to arrive to mod the PCIe slot one. And need to find a solution to use this card without a riser cable. Its just too tall...
> 
> 
> 
> http://www.3dmark.com/spy/15597378
> 
> 
> 
> 
> 
> https://www.3dmark.com/pr/545164
> 
> 
> 
> Oh, and using the Asus Thermal Pads from the TUF instead of using of EKs reduced my coil whine a lot. I guess you could also use the Alphacool Eisschicht Thermal Pad as replacement.


Hi, did you use the conductive-paint method, stacking, or soldering?


----------



## MRLslidchen

I soldered them piggyback: 5 mΩ on the two 8-pin shunts and 8 mΩ on the three shunts below. Before that I glued them onto the original shunts with high-temp silicone to hold them in place (the stuff used to glue CPU heatspreaders).
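For anyone weighing the same mod: the piggyback trick works because parallel resistors combine, and the power controller still assumes the stock shunt value, so it under-reads current by that ratio. A quick sketch of the arithmetic (the 5 mΩ stock value is an assumption, a common value on these boards; check yours before soldering):

```python
def parallel(r1_mohm, r2_mohm):
    """Effective resistance of two shunts stacked in parallel (milliohms)."""
    return (r1_mohm * r2_mohm) / (r1_mohm + r2_mohm)

def true_power(reported_w, stock_mohm, added_mohm):
    """Actual draw when the controller still assumes the stock shunt value."""
    return reported_w * stock_mohm / parallel(stock_mohm, added_mohm)

# Stacking 5 mΩ on an (assumed) 5 mΩ stock shunt halves the sensed voltage
# drop, so the limiter under-reads power by 2x.
print(parallel(5, 5))         # 2.5
print(true_power(320, 5, 5))  # 640.0
```

With a 5 mΩ piggyback on a 5 mΩ stock shunt the limiter sees half the real draw, so a 320 W cap behaves like roughly 640 W; an 8 mΩ piggyback gives a gentler ~1.6x scaling.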


----------



## mattxx88

MRLslidchen said:


> I soldered them piggyback: 5 mΩ on the two 8-pin shunts and 8 mΩ on the three shunts below. Before that I glued them onto the original shunts with high-temp silicone to hold them in place (the stuff used to glue CPU heatspreaders).


Thanks for your post. Did you notice any relevant difference between the two solutions?


----------



## MRLslidchen

Just gluing didn't work on one shunt; there was no contact, or silicone got between the contacts. Maybe I should have put something heavy on top of it to press it down. So I just soldered over the contacts. Pretty easy process for my first time.


----------



## ZOONAMI

As a TUF OC owner, what is the recommended BIOS to flash for a bit more power?
Gigabyte?

Strix? Or does that cause issues because it's a 3x8-pin card?


----------



## mattxx88

ZOONAMI said:


> As a tuf OC owner what is the recommended bios to flash for a bit more power???
> gigabyte?
> 
> strix? Or does that cause issues because it’s a 3 pin?


Looking at the first page, among the 2x8-pin cards the TUF has the best BIOS.
May I ask, since I have a TUF too: could you run Superposition 1080p Extreme and tell me the max board power draw you read in the GPU-Z sensors?


----------



## MRLslidchen

I managed to get a little more performance with the Aorus Master Bios. But it introduced a boot problem for me with a 7F Q-Code.
The Strix BIOS will work. It might even show 450 W (that value is false), but without the third power plug it's not going to be able to pull that wattage, because each 8-pin is capped at 150 W, or whatever value is set in the BIOS.


----------



## mattxx88

MRLslidchen said:


> I managed to get a little more performance with the Aorus Master Bios. But it introduced a boot problem for me with a 7F Q-Code.
> The Strix BIOS will work. It might even show 450 W (that value is false), but without the third power plug it's not going to be able to pull that wattage, because each 8-pin is capped at 150 W, or whatever value is set in the BIOS.


Interesting, I'll give it a try.


----------



## j o e

jesus christ, this card is HUNGRY


----------



## wkdsean88

Rbk_3 said:


> Damn, so I think there is something wrong with my Trio. It is only boosting to max 1950 at stock in Port Royal with the only change being a 102 power limit. Is there something I am maybe doing wrong?
> 
> 
> 
> https://www.3dmark.com/pr/522284
> 
> 
> 
> 
> I tried a manual OC to 2010 MHz at 0.95 V and couldn't even get through Time Spy. It made it through at 0.975 V and 1995 MHz. I think I may have lost the lottery.
> 
> 
> 
> https://www.3dmark.com/spy/15445883





https://www.3dmark.com/compare/spy/15501865/spy/15445883



I have the same card. I manually overclocked mine and got it running above 2 GHz in Time Spy.

Your average temps looked a bit high in comparison.


----------



## nam3less

I got my Ventus yesterday. Not my first choice, but the only one I was able to get to replace my 1080 Ti setup. I kept trying to get another 3080 in the meantime but had orders voided by Newegg for the Aorus and the regular Gigabyte OC. It's been a ton of effort just to get this far.

The power limit is a bit of a bummer. Undervolted to 0.862 V @ 1920 MHz; ruler flat and never dips. +500 memory, but I haven't dialed this in. It never hits the power limiter and still gets to 317 W in benchmarks. Without killing any background processes and with a second monitor attached, I got 18,100 in Time Spy and 11,500 in Port Royal; temps stay below 70°C with a custom fan curve of 10% above temp, so 60% fan speed at 50°C, 70% at 60°C. This is virtually silent compared to the AIO fans for my CPU cooler. I'm running a 7700K, so I'm sure the score could be a bit higher with a better CPU and PCIe 4.0.

I didn't expect thermal throttling on this card, which I thought was weird. It never hits the 83°C target, and yet thermal throttling is reported. I can only think it's the memory throttling, which is why I say I haven't dialed it in. The error correction is throwing me off. Any ideas why it would thermally throttle? I don't believe it has memory temp sensors.

I've been getting 10% better performance in The Witcher 3 over my SLI setup, which I didn't expect: 80-ish vs 90-ish FPS. Warzone is 100-130 FPS; the 1080s ran between 60-80 in Warzone. I could create profiles per game, but that sounds like too much effort for 1 FPS. For example, Warzone was using 250 W while The Witcher 3 used 300 W.

Overall, I'm content. I'm not an extreme overclocker, but I do like to maximize performance. The 3080 is my bridge year; too many games to play with the PS5 as well. My thinking is to get rid of SLI for around the same performance at a good price, save the $300-750 instead of buying a 3080 Ti or 3090, and then put that toward a 4090 or 4080 Ti under water. The Ventus, despite being the bastard child, performs really well, well within the 3-5 FPS difference compared to all the other cards.
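The "10% above temp" fan curve described above is just a linear rule, fan duty = temperature + 10. Sketched out (the clamp bounds here are my own assumption, not from the post):

```python
def fan_percent(temp_c, offset=10, floor_pct=30, ceil_pct=100):
    """Fan duty = GPU temp + fixed offset, clamped to assumed sane bounds."""
    return max(floor_pct, min(ceil_pct, temp_c + offset))

print(fan_percent(50))  # 60 -> 60% fan at 50°C, as in the post
print(fan_percent(60))  # 70
```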


----------



## ausmisc

Are you just talking about where it drops a frequency bin when it goes from, say, 55°C up to 60°C? I think that's just a feature of the GPU Boost system. It dynamically adjusts your curve depending on the temperature; there's no way around that one. It's why I do all my curve adjustments on a cold card, to try to keep it uniform. There could be a 30-50 MHz difference between a cold and a hot curve.
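For what it's worth, GPU Boost moves the clock in 15 MHz bins, and a toy model of the temperature behavior looks something like this (the 35°C threshold and 10°C step are illustrative guesses, not Nvidia's published table):

```python
BIN_MHZ = 15        # GPU Boost adjusts clocks in 15 MHz steps
STEP_C = 10         # assumed °C per dropped bin (illustrative)
COLD_LIMIT_C = 35   # assumed temp below which no bins are dropped

def boost_clock(cold_clock_mhz, temp_c):
    """Estimated boost clock after temperature-based bin drops."""
    if temp_c <= COLD_LIMIT_C:
        return cold_clock_mhz
    bins_dropped = int((temp_c - COLD_LIMIT_C) // STEP_C) + 1
    return cold_clock_mhz - bins_dropped * BIN_MHZ

# A curve dialed in cold can sit 2-3 bins (30-45 MHz) lower once warm.
print(boost_clock(2010, 30))  # 2010
print(boost_clock(2010, 60))  # 1965
```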


----------



## nam3less

ausmisc said:


> Are you just talking where it drops a frequency bin when it goes from say 55c up to 60c? Think that's just a feature of the GPU boost system. It dynamically adjusts your curve depending on the temperature, no way around that one. It's why I do all my curve adjustments on a cold card to try keep it uniform.. could be 30-50mhz difference between a cold and hot curve.


Yes, that's exactly it. I didn't see this behavior on my 1080 Ti; it would always reach the target temp before throttling.


----------



## Belcebuu

Hi guys,
Is there any recommendation for a cheaper RTX 3080 that can get the top OC power limit of 400 or 450 W? Can the TUF, with only 2x8-pins, get up to 400-450 W? Thanks


----------



## Stash

j o e said:


> jesus christ, this card is HUNGRY


I can hear it screaming from here.


----------



## MRLslidchen

Belcebuu said:


> Hi guys,
> Is there any recommendation of a cheaper 3080 rtx that can get the top oc power limit of 400 or 450w ? can the TUF with only 2x8 pins get up to 400 450w ? Thanks


Get a Trio X and flash the Strix, FTW3, or Suprim BIOS. At current pricing, the Trio might be the cheapest of the 400 W+ cards, if you are lucky.


----------



## SPL Tech

MRLslidchen said:


> Get a Trio X and flash the strix, ftw or suprim bios. With the current pricing, the trio might be the cheapest of the 400w+ cards, if you are lucky.


There are no cards available; it's all scalper prices on eBay. No retailer is actually selling any of these things. It will probably be three months before you can actually get stock at MSRP anywhere.


----------



## MRLslidchen

Yeah, I was looking forward to getting a 3080 FTW clown edition, but settled for a TUF instead.


----------



## eliwankenobi

Erik9519 said:


> If you could let us know what GPU-Z reports for Board power usage during Furmark and Port Royal with the power slider at default and maxed that would be great. My card for instance being a Inno3D 3080 iChill Frostbite rated for 340W won't use more than 320W in anything but Furmark. Changing bioses as reported by others for certain cards also seems to do absolutely nothing which is very disappointing...


Hello!

Sadly I can confirm it does not go above 320-325 watts during port royal


----------



## parcher

ausmisc said:


> Yeah, flashed it to my Trinity non-OC with no issues. I have only tested the first two DisplayPort outputs, as that's all I use. Seeing power draw up to a max of 370 W while benching. Don't know if the RGB still works, and I couldn't care less. Both graphics scores are in the top 3% for the 3080; seems decent for a "cheap" 2x8-pin card.
>
> https://www.3dmark.com/spy/15590271
> https://www.3dmark.com/pr/531935


*55°*?? Are you on liquid?? :unsure:


----------



## DrMorphine

MRLslidchen said:


> Get a Trio X and flash the strix, ftw or suprim bios. With the current pricing, the trio might be the cheapest of the 400w+ cards, if you are lucky.


But is the flashing worth it? I mean, for daily use and benchmarks you're getting 2% for +100 watts, plus a possibly voided warranty (I heard flashing a different BIOS voids the warranty, if something happens of course). Is it safe on a card without dual BIOS?
I have the 3080 X Trio, OC'd +40 MHz on core and +500 on VRAM, fully stable in all games on the original BIOS (I can do +135 core and +800 VRAM for benchmarks, but it's not stable for gaming). Do you really leave it at 450 W for daily use and games? I know benchmarking is fun, but it looks like you're trading so much for so little in gaming.


----------



## VPII

DrMorphine said:


> But is the flashing worth it? I mean, for daily use and benchmarks you're getting 2% for +100 watts, plus a possibly voided warranty (I heard flashing a different BIOS voids the warranty, if something happens of course). Is it safe on a card without dual BIOS?
> I have the 3080 X Trio, OC'd +40 MHz on core and +500 on VRAM, fully stable in all games on the original BIOS (I can do +135 core and +800 VRAM for benchmarks, but it's not stable for gaming). Do you really leave it at 450 W for daily use and games? I know benchmarking is fun, but it looks like you're trading so much for so little in gaming.


For gaming it is not really worth it, from what I have seen. I flashed all three of the RTX 3080 cards I had with various BIOSes, and in the end I determined that the stock BIOS is best. The only BIOS that really increased my benchmark scores on the MSI RTX 3080 Gaming X Trio was the MSI RTX 3080 Suprim BIOS. But testing games with that BIOS did not yield any added performance; my card actually clocked lower using it compared to the Gaming X Trio BIOS.


----------



## Belcebuu

MRLslidchen said:


> Get a Trio X and flash the strix, ftw or suprim bios. With the current pricing, the trio might be the cheapest of the 400w+ cards, if you are lucky.


I read that the Trio is not that great, with only 16 power phases, so even if you have the 400 W BIOS you will be limited by your power phases. Am I right?

I guess it is the same for the TUF, with 20 power phases but only 2x8-pins?


----------



## bmgjet

Belcebuu said:


> I read that the trio is not that great only 16 power phases so even if you have the 400w bios you will be limited by your power phases, am I right?
> 
> I guess it is the same for the TUF with 20 power phases and then only 2x8 pins ?


16x 50 A phases.
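Rough back-of-envelope on why 16 phases isn't the limit (assuming the stages are rated around 50 A each and a core voltage near 1.0 V under load; real continuous ratings and derating will be lower):

```python
phases = 16
amps_per_phase = 50   # assumed per-stage rating (datasheet peak, not continuous)
core_voltage = 1.0    # rough GPU core voltage under load (assumption)

max_current_a = phases * amps_per_phase            # 800 A
theoretical_watts = max_current_a * core_voltage   # ~800 W on the core rail

print(theoretical_watts)  # 800.0
```

Even heavily derated, that is far above any 400-450 W BIOS limit, so the BIOS power cap, not the VRM, is the practical ceiling.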


----------



## VPII

Belcebuu said:


> I read that the trio is not that great only 16 power phases so even if you have the 400w bios you will be limited by your power phases, am I right?
> 
> I guess it is the same for the TUF with 20 power phases and then only 2x8 pins ?


From what I heard, the power phases are more than enough. I saw this with my MSI RTX 3080 Gaming X Trio, where flashing the MSI Suprim BIOS yielded much better average clocks during benching, with the card still running pretty cool, depending on ambient temps.


----------



## nam3less

My Ventus is rock steady in games at 1935 MHz @ 0.9 V. It stays at 65°C. I had some crashing issues at lower power. This is a good balance where it hovers around 280-300 watts with occasional spikes to the power limit, and I mean occasional: it clocks down to 1920 for a split second, then right back up. This was tested in The Witcher 3 ultra with extra textures and visual mods, and in Warzone, at 3840x1600 resolution in both. Warzone uses 250 W.

I didn't notice any difference in performance in either game with any of the OCs I tried; they all had the same frame rate, and the benchmarks are where I got the 2% extra. For example, I can run Port Royal at 1980 MHz, but Time Spy crashes, and so do games, unless I up the voltage, but then I'm constantly at the limiter. The card will do 2160, but it'll hit the limiter. It's nice for benchmarks to see an average above 2 GHz, but it's definitely not as smooth an experience in games as a flat line.

I’ll report back when cyberpunk comes out. I hope that doesn’t smash me against the limiter and I have to lower clocks. I’m not playing at 4k so I have some room to go down.


----------



## rioja

Anything good to say about the Gigabyte Waterforce 3080?

AORUS GeForce RTX™ 3080 XTREME WATERFORCE WB 10G (rev. 1.0) Key Features | Graphics Card - GIGABYTE Global

www.gigabyte.com


----------



## squadz

Any reason to keep the Ventus 3080 OC over the TUF 3080 OC if the TUF 3080 OC was cheaper?

I know my Ventus was hardware limited for OC; Asus doesn't have that problem? Anything else?


----------



## BluemoonRisen

ausmisc said:


> Yeah flashed it to my Trinity non-OC no issues. Have only tested the first 2 DP as that's all I use. Seeing power draw up to a max of 370w in benching. Don't know if the RGB still works and couldn't care less. Both graphic scores are top 3% for 3080, seems decent for a "cheap" 2x8pin.
> 
> 
> 
> https://www.3dmark.com/spy/15590271
> 
> 
> 
> 
> https://www.3dmark.com/pr/531935


Thank you, I flashed it and it "works", I guess.

My score went up and I have the same score as you.

The BIOS seems more stable than the Trinity one.

But it can't get over the 100% power limit and stays at 330 W max board power draw, or am I missing something?

It doesn't matter if I slide it to 110%; maybe 6 W more.


----------



## eliwankenobi

I was gonna ask the same thing, but regarding the EVGA 3080 XC3 Ultra: it doesn't go above 320 watts.

That 320 W is the limit for the reference design, while the FE has a 370 W limit, which is the reason the FE tends to have higher scores. Very disappointing for a card that has a $70 up-charge over the FE's MSRP.


----------



## saar

OK, I sold my Zotac 3080 and got a new Aorus Xtreme.


----------



## saar

time spy stock


----------



## SPL Tech

These cards are a scam. No one actually sells them; it's just overpriced scalpers on eBay. I've been trying to get one for months and can't, because they don't exist. Fake product, fake news. Nvidia screwed this up beyond any possible measure. Like, hey, we have this new card out, but there are only 100 for sale, and after that we are moving on to the 4000 series cards for 2022. LOL


----------



## Falkentyne

oops, thought this was the 3090 thread. 370w max for 3080, 400w for 3090...


----------



## saar

The Aorus Xtreme has always been overpriced. I had the 1080 Ti Aorus Xtreme Waterforce and it was overpriced too :/


----------



## eliwankenobi

saar said:


> time spy stock


Wow! Very nice! That's what a higher power limit and sufficient cooling can give you!

This is my best score so far. I am starting to get frustrated by the 320 W power limit.


----------



## rioja

saar said:


> ok i sold my zotac 3080 and got new auros xtreme


Finally, proper 3x8-pin power with decent power stages. I really hope it is properly executed, as there aren't many other options to consider:

The ASUS Strix has severe coil whine
EVGA is never sold in our region
The MSI Trio has fewer stages
The Palit GameRock has no compatible waterblock

So the Aorus Xtreme may be the only option to hunt for.

Just tell me please, does it have coil whine?

By coil whine I mean a level as high as the TUF/Strix.


----------



## ssgwright

annoying? that's nothing! hahahaaa


----------



## rioja

Btw, regarding those cases in the video: after one month of waiting, the Asus regional office confirmed it is a warranty case and they will write the card off (or however you say it).


----------



## BluemoonRisen

So 330 W max with +180 core and +1000 mem; that's the maximum I can get with the 3080 Trinity on water and the Amp Holo BIOS.


----------



## mardon

eliwankenobi said:


> Wow! Very nice! What a higher power limit and sufficient cooling can give you!
> 
> This is my best score so far. I am starting to be frustrated by the 320 watt pwr limit


I'm confused... why am I busting my balls trying to get a 3080 when that score is hardly any better than my 2080 Ti's?

Am I missing something?



https://www.3dmark.com/spy/14835421


----------



## mardon

BluemoonRisen said:


> So with 330W max. and +180 core +1000 mem thats the maximum i can get with the 3080 Trinity on water and Amp Holo BIOS.


Sorry, should have read further. That's looking a bit better. Does the 370 W BIOS not work?


----------



## asdkj1740

rioja said:


> Anything good to tell about Gigabyte Waterforce 3080
> 
> 
> 
> 
> 
> 
> 
> 
> 
> AORUS GeForce RTX™ 3080 XTREME WATERFORCE WB 10G (rev. 1.0) Key Features | Graphics Card - GIGABYTE Global
> 
> 
> Discover AORUS premium graphics cards, ft. WINDFORCE cooling, RGB lighting, PCB protection, and VR friendly features for the best gaming and VR experience!
> 
> 
> 
> 
> www.gigabyte.com
> 
> 
> 
> 
> ?


No, this one actually has the Master PCB (dual 8-pins) instead of the Xtreme PCB with triple 8-pins, and the power limit is 370 W max, just like the air-cooled Master.


----------



## Shadowdane

mardon said:


> I'm confused.. Why am I busting my balls trying to get a 3080, that score is hardly any better than my 2080ti?
> 
> Am I missing something?
> 
> 
> 
> https://www.3dmark.com/spy/14835421


Well, you got a very fast overclock on both your CPU and GPU. Looks like you got super lucky with the clocks you're hitting.

My 2080 Ti won't clock higher than ~2065 MHz; I get driver crashes when it boosts much higher than that. Not to mention my 9900K is a serious dud for overclocking; the fastest I can manage is 4.9 GHz all-core to keep temps below 90°C. I can get it to 5 GHz, but I see temps in the mid-90s, which is way too hot for me.

This is probably more typical of a 2080Ti score for most people:


https://www.3dmark.com/spy/14623867


----------



## mardon

Shadowdane said:


> Well you got a very fast overclock on both your CPU & GPU. Looks like you got super lucky on your CPU & GPU with those clocks your hitting.
> 
> My 2080Ti won't clock higher than ~2065Mhz get driver crashes when it boosts much higher than that. Not to mention my 9900K is a serious dud for overclocking fastest I can manage is 4.9Ghz all-core to keep temps below 90C anyway. I can get it to 5Ghz but see temps in the mid-90C which is way too hot for me.
> 
> This is probably more typical of a 2080Ti score for most people:
> 
> 
> https://www.3dmark.com/spy/14623867


I had similar results on the 2080 Ti until I water cooled it. They love being below 50°C.

So possibly not worth an upgrade this gen then?
I'd be interested in seeing water-cooled 370 W results for the 3080 to see what the difference is. I've got the upgrade itch!!

Yes, very lucky with the 9900KS: eBay, £350!! There was another one on there at 5.6 GHz finishing a day later; I think everyone was holding out for that!!


----------



## Shadowdane

mardon said:


> I had similar results on the 2080ti until I water cooled it. They love being below 50c.
> 
> So possibly not worth an upgrade this gen then?
> I'd be interested in seeing water cooled 370w results for the 3080 to see what the difference is then. I've got the upgrade itch!!
> 
> Yes very lucky with the 9900ks. EBay £350!! There was another on there at 5.6ghz finishing a day later. I think everyone was holding out for that!!


Yeah, when I saw the temps listed in your 3DMark I knew you were on H2O. I've debated doing that, but I've never touched water cooling outside of AIO coolers and it scares me. LOL


----------



## mardon

Shadowdane said:


> Yah when I saw the temps listed in your 3dmark i knew you were on H2O. I've debated doing that but I've never touched water cooling outside of AIO coolers and it scares me. LOL


I originally was on a Kraken G12 with a 120mm with good results (max 58°C overclocked). The full custom loop has brought it down under 50°C now and has allowed higher locked clocks. Unfortunately my case is SFF, so I'm restricted to a certain pool of 3080 cards due to the width of the card. I'm interested to see how the reference boards (which don't protrude past the IO bracket) perform against my current setup, to gauge whether an upgrade is worth it. I bought a Zotac 3090 but got buyer's remorse and cancelled when I saw the 10% difference at double the cost.


----------



## Zemo

Anyone with the Gigabyte RTX 3080 Master (non-Xtreme), can you share your experience with the model?
I just bought one as I couldn't snatch the TUF; it seems like a good card.


----------



## ViTosS

Just received my FTW3 Ultra. I installed EVGA Precision X and it prompted a firmware update, which I accepted. My boost clock at 4K gaming stock is about 1950-1980 MHz; is that good? Also, how can I know if I already have the best BIOS, the one with the 450 W power limit and such? For now, should I stick with MSI AB 4.62? I also noticed the voltage monitor/control is locked in MSI AB and doesn't work even if I enable it, while in EVGA Precision X it is available. Is it safe to increase it to 100 (it means 100 mV just like on other cards, and I assume it's safe, right)?

Thanks!


----------



## sakete

ViTosS said:


> Just received my FTW3 Ultra, I installed EVGA Precision X and it prompted a firmware update, I accepted, my boost clock at 4k gaming stock is about 1950-1980Mhz, is that good? Also how can I know if I have already the best BIOS, the one with power limit 450W and stuff? For now should I stick with MSI AB 4.62? Also I noticed the voltage monitor/control is locked in MSI AB and doesn't work even if I set to enable it, while in EVGA Precision X is available, is it safe to increase it till 100 (it means 100mv just like the other cards and I assume it's safe, right)?
> 
> Thanks!


To get the 450W power limit BIOS, you'll need to install the beta BIOS linked on the EVGA forums. What comes pre-installed with the card isn't the high power limit version.


----------



## bmgjet

ViTosS said:


> Just received my FTW3 Ultra, I installed EVGA Precision X and it prompted a firmware update, I accepted, my boost clock at 4k gaming stock is about 1950-1980Mhz, is that good? Also how can I know if I have already the best BIOS, the one with power limit 450W and stuff? For now should I stick with MSI AB 4.62? Also I noticed the voltage monitor/control is locked in MSI AB and doesn't work even if I set to enable it, while in EVGA Precision X is available, is it safe to increase it till 100 (it means 100mv just like the other cards and I assume it's safe, right)?
> 
> Thanks!


The voltage slider isn't in mV anymore.
It's a percentage of the V/F curve, but by default Nvidia's boost will take the V/F curve up to the max allowed voltage anyway, if temps and power allow it.
So the difference between 0 and 100 is holding the max voltage allowed by the BIOS for maybe one second longer before the temp and power limits drag the curve back down.


----------



## ViTosS

sakete said:


> To get the 450W power limit BIOS, you'll need to install the beta BIOS linked on the EVGA forums. What comes pre-installed with the card isn't the high power limit version.





bmgjet said:


> Voltage slider isnt a mv anymore.
> Its a percentage of the VF curve, But by default Nvidia boost will take the VF curve upto max allowed voltage anyway if temps and power allow so.
> So the difference between 0 and 100 will be holding max voltage allowed in the bios for maybe 1 sec longer before temps and power limit drag the curve back down.


I see, but can that BIOS from the EVGA forums make the stock boost more solid? I mean, I see a lot of fluctuation in my boost, especially when OC'd to 2050 MHz; it changes from 2025-2050, sometimes 2010 MHz, without the temperature changing. Is that normal behavior?


----------



## Micko

Crysis 3 is the most sensitive game I have regarding GPU overclocks. 0.9 V / 1950 MHz was stable in every other game, including Control, but in Crysis 3 I had to lower the core frequency two notches down to 1920 MHz to keep it from crashing to desktop.


----------



## DStealth

Anyone with Asus tested SAM with a 3080?

Asus Brings Resizable BAR Support to Z490 Motherboards

Free performance for Intel owners too

www.tomshardware.com


----------



## Nizzen

DStealth said:


> Anyone with Asus tested SAM with 3080 ?
> 
> 
> 
> 
> 
> 
> 
> 
> Asus Brings Resizable BAR Support to Z490 Motherboards
> 
> 
> Free performance for Intel owners too
> 
> 
> 
> 
> www.tomshardware.com


I think we are waiting for an Nvidia driver update to support "SAM" on 3070/3080/3090 cards.

The 6800 XT with "SAM" should now work on Z490 with this BIOS update.


----------



## eliwankenobi

mardon said:


> I'm confused.. Why am I busting my balls trying to get a 3080, that score is hardly any better than my 2080ti?
> 
> Am I missing something?
> 
> 
> 
> https://www.3dmark.com/spy/14835421


Yeah, well, my CPU is a 3800X. Although it's well tuned and running 3800 MHz CL16 memory, it still lags behind an OC'd 9900K. My 3080 score still hasn't reached above 18k (getting close though, with +135 core and +550 on VRAM), which seems comparable to what I've seen from other reviewers and users. I can see the card trying to go higher, but it's limited by a hard cap that won't let it go past ~325 watts. Maybe an extra 40 watts in the power budget would do the trick!

Still, I’m coming from a 1080ti so for me the perf uplift is very noticeable. 

This is my Shadow of Tomb Raider score at 1440p ULTRA settings with RT OFF. That’s pretty good I believe.


----------



## VPII

eliwankenobi said:


> Yeah, well, my CPU is a 3800x. Although well tuned and running 3800mhz CL16 memory, it still lags behind an OC’d 9900k. My 3080 score still hasn’t reached above 18k (getting close though, with +135 core and +550 on vram) which is what I’ve seen from other reviewers and users around and seems comparable. I can see the card trying to go higher though, and it’s limited by hard cap that won’t let it go past ~325watts. Maybe an 40watts in the budget would do the trick!
> 
> Still, I’m coming from a 1080ti so for me the perf uplift is very noticeable.
> 
> This is my Shadow of Tomb Raider score at 1440p ULTRA settings with RT OFF. That’s pretty good I believe.


With or without DLSS? Just asking


----------



## BluemoonRisen

An update on the ZOTAC RTX 3080 Amp Holo.

Maximum power is 340 W even with the +10% power limit; that is why my Trinity cannot get above 340 W.

I just talked to Zotac support and they said it is capped at 340 W, and the +10% power limit function does not work on the Amp Holo cards.


----------



## reflex75

mardon said:


> I'm confused.. Why am I busting my balls trying to get a 3080, that score is hardly any better than my 2080ti?
> 
> Am I missing something?
> 
> 
> 
> https://www.3dmark.com/spy/14835421


Yes, you are missing something: you are targeting the wrong card.
Coming from a 2080 Ti, your path is the 3090, not the 3080.
And Ampere can bring you at least 30% better performance vs. Turing:
https://www.3dmark.com/compare/spy/14835421/spy/15738613


----------



## zhrooms

xermalk said:


> The "375W PL is Bs ,and others have the exact same issue when i search around.
> The slider can go that high, the card refuses. And shows power limit 1 whenever it passes 345.
> 
> First page should be changed until someone actually proves it can do 375W.


No, because it's a temporary issue, they just need to release an updated BIOS. No idea when that might happen but it's fairly safe to say they'll get to it eventually.


----------



## zhrooms

reflex75 said:


> Yes, you are missing something: you are targeting the wrong card.
> Coming from a 2080ti your path is 3090, not 3080.
> And Ampere can bring you at least 30% better performance vs Turing:
> https://www.3dmark.com/compare/spy/14835421/spy/15738613


Yes and no. It's actually the 3080 Ti/SUPER, which is unreleased (to counter the 6900 XT), and it'll feature 20GB of memory (10GB on each side) and an increased CUDA core count; basically a 3090 "Light", meaning the 3090 is the Titan (a slight upgrade over the Ti).

The 3080 is only 8-11% faster than a 2080 Ti when compared overclocked to overclocked, hence the 3080 is "barely" an upgrade. A sidegrade if anything, since you get slightly higher RT performance and HDMI 2.1, and that's about it; no one cares about going from 100 to 108 FPS for a few hundred dollars.


----------



## ssgwright

10% more performance is a "sidegrade"? lol and at half the price hahahaaaa


----------



## c0nsistent

I own both and I've tested them on a 3600X. My 2080 Ti overclocks over 2100MHz and is power unlocked (XOC). When completely maxed out it scores within 5% of a completely barebones 3080. When I OC the 3080, even power limited, that goes up to roughly 12-15%. When my shunts get here, that will likely move to 20%. I'll be grabbing a 3090 in March if the 3080 Ti isn't out yet.

** And yes, I'm testing in GPU limited scenarios such as Superposition 4k Optimized.


----------



## lordzed83

There ya go, made a vid because there is literally zero content about these blocks. Also got an Alphacool one coming this or next week, so I'll have a hands-on comparison. One stays, the other gets sold.


----------



## lordzed83

When I get off playing ray-traced WoW I'll go get it installed.


----------



## Warrimonk

So, besides shunt modding there is no way to increase the Power Limit past 107% on the EVGA 3080 XC3 Ultra?
I saw some posts and videos about BIOS flashing, but it seems that without the shunt mod all they do is decrease the overall power limit...

Kind of disappointed, since I'm expecting an XC3 Ultra 3080 in the mail. It's the only one I could get my hands on. Seems like a rather lackluster card in pretty much all regards.


----------



## Alemancio

Warrimonk said:


> So, besides shunt modding there is no way to increase the Power Limit past 107% on the EVGA 3080 XC3 Ultra?
> I saw some posts and videos about BIOS flashing, but it seems without the shunt mod all they do is decrease overall power limit..
> 
> Kind of disappointed since I am expecting a XC3 Ultra 3080 in the mail. It is the only one I could get my hands on. Seems like a rather lackluster card in pretty much all regards.


Have you tried undervolting? I'm using a FTW3 at 2100MHz @ 1025mV and it barely peaks at 380W


----------



## Warrimonk

Alemancio said:


> Have you tried undervolting? Im using a FTW3 at 2100MHz @ 1025mV and it barely peaks at 380W


I don't have the card yet; it's expected to arrive tomorrow. The problem with the XC3 Ultra is that it seems to be limited to 340 watts. That's 40W less than your FTW3 peaks at, and reports are that it seems to hinder the card quite a bit.


----------



## magnetik

12161 Port Royal with Intel i5 2500K. Yes, you read that correctly.

EVGA 3080 FTW3 Gaming (non Ultra) @ 2055MHz core / 1188 MHz mem

I feel like I have a special chip on my hands because it's beating all other 2500K/3080 pairs on Port Royal.


----------



## eliwankenobi

VPII said:


> With or without DLSS? Just asking


Without DLSS/RayTracing


Sent from my iPhone using Tapatalk


----------



## eliwankenobi

Warrimonk said:


> So, besides shunt modding there is no way to increase the Power Limit past 107% on the EVGA 3080 XC3 Ultra?
> I saw some posts and videos about BIOS flashing, but it seems without the shunt mod all they do is decrease overall power limit..
> 
> Kind of disappointed since I am expecting a XC3 Ultra 3080 in the mail. It is the only one I could get my hands on. Seems like a rather lackluster card in pretty much all regards.


I also have a 3080 XC3 Ultra. Got it last Friday. Can confirm the power limit you mention. Now I'm pondering flashing the BIOS with the FTW XOC one. It supposedly gives you an extra 50 watts, according to Frame Chasers. He did a lot of experiments on the XC3 Ultra.


Sent from my iPhone using Tapatalk


----------



## Alemancio

Warrimonk said:


> I don't have the card yet, it is expected to arrive tomorrow. The problem with the XC3 Ultra is that it seems to be limited to 340 Watts. That's 40 less than your FTW3 peaks at, and reports are that it seems to hinder the card quite a bit.


That doesn't matter because you won't expect to run 2100MHz on your XC3. At 2000MHz and 950mV you'd reach maybe 300W, so no power limit ;-)
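Alemancio's ballpark is consistent with the usual first-order rule that dynamic power scales with frequency times voltage squared. A quick sketch of that estimate (the 380W / 2100MHz / 1025mV baseline comes from his earlier post; the rule is an approximation, not a measurement):

```python
def scale_power(p0_w: float, f0_mhz: float, v0_mv: float,
                f1_mhz: float, v1_mv: float) -> float:
    """First-order dynamic-power estimate: P is roughly proportional to f * V^2."""
    return p0_w * (f1_mhz / f0_mhz) * (v1_mv / v0_mv) ** 2

# Baseline: ~380 W at 2100 MHz, 1025 mV (from the FTW3 post above).
est = scale_power(380, 2100, 1025, 2000, 950)
print(round(est))  # ~311 W, close to the "maybe 300 W" guess
```

Real cards also have static (leakage) power that doesn't scale this way, so treat the result as a rough lower-ish bound rather than a prediction.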


----------



## Warrimonk

eliwankenobi said:


> I also have a 3080 xc3 ultra. Got it last friday. Can confirm the power limit you mention. Now I’m pondering on flashing the BIOS with the FTW XOC. It supposedly gives you an extra 50watts according to Frame Chasers. He did a lot of experiments on the XC3 Ultra.
> 
> 
> Sent from my iPhone using Tapatalk


People already ripped apart that video in this forum... it works, but ONLY if you do the shunt mod first. FrameChasers had already done the shunt mod before flashing the FTW3 450W BIOS. People who tested without the shunt mod actually had a power limit loss.


----------



## ViTosS

Well... apparently my FTW3 Ultra is garbage-tier at OC. I can't get it over 2100MHz no matter what voltage I try. I also don't see the card using more than 1.075V under load; I tried a custom voltage curve in MSI AB to make it hit 1.10V and more, and it keeps getting maxed at 1.075V. Is that normal? I also tried to unlock the voltage by adding +100 in MSI AB, which made it go to 1.10V, but that still wasn't enough to make my card stable at 2100MHz+. So I just gave up and settled for 0.950V fixed and a 1980MHz locked boost. Should I try the EVGA forum XOC BIOS, or won't that change my lottery? Thanks!


----------



## SoldierRBT

Lower ambient temp. Still on air. 



https://www.3dmark.com/pr/579484


Graphics Score 133nice



https://www.3dmark.com/dxr/27224


----------



## ssgwright

SoldierRBT said:


> Lower ambient temp. Still on air.
> 
> 
> 
> https://www.3dmark.com/pr/579484
> 
> 
> Graphics Score 133nice
> 
> 
> 
> https://www.3dmark.com/dxr/27224


You're telling me you achieved 2265MHz on air? I'm calling shenanigans


----------



## Stash

SoldierRBT said:


> Lower ambient temp. Still on air.
> 
> 
> 
> https://www.3dmark.com/pr/579484
> 
> 
> Graphics Score 133nice
> 
> 
> 
> https://www.3dmark.com/dxr/27224


Naughty, what's the voltage?


----------



## ausmisc

BluemoonRisen said:


> But it can't get over 100% Power Limit and stays at 330W max Board Power Draw, or am I missing something?


Others over on the ZOTAC Reddit have had the same issue; they can't seem to get their Trinity to draw above 340W with the Amp BIOS. I don't know if it's a hardware limitation to do with load balancing or what; can't explain why mine sees up to 370.


----------



## SoldierRBT

Stash said:


> Naughty, what's the voltage?


I’m still using 1.056v locked for Port Royal to avoid hitting PL (450W). I can use 1.062v too but for some reason the card downclocks from 2265MHz straight to 2190-2175MHz at that voltage giving me lower scores in PR. I double checked the wattage and voltage and they’re fine at 1.062v it’s just the core doesn’t like that voltage bin idk.



ssgwright said:


> You're telling me you achieved 2265MHz on air? I'm calling shenanigans


Yeah the card is on stock cooler with the 450W beta BIOS. I already did a video a month ago running Port Royal at 2220-2205MHz avg (2250MHz max) 21C ambient. An extra +15MHz for 7C less shouldn't be that hard to believe.


----------



## rankftw

New BIOS for the Palit Gaming Pro and OC came out yesterday. The power limit cap is still 350W, and I can't see any difference in clocks.


----------



## Alemancio

ViTosS said:


> Well... apparently my FTW3 Ultra is garbage-tier at OC. I can't get it over 2100MHz no matter what voltage I try. I also don't see the card using more than 1.075V under load; I tried a custom voltage curve in MSI AB to make it hit 1.10V and more, and it keeps getting maxed at 1.075V. Is that normal? I also tried to unlock the voltage by adding +100 in MSI AB, which made it go to 1.10V, but that still wasn't enough to make my card stable at 2100MHz+. So I just gave up and settled for 0.950V fixed and a 1980MHz locked boost. Should I try the EVGA forum XOC BIOS, or won't that change my lottery? Thanks!


LOL, dude, 3080s rarely hit 2100+, relax. Also don't trust half the people here saying they run 2100+ without proof. I've tested 6 FTW3s and other 3080s, and only 1 or 2 do 2100MHz stable on air. The vast majority do 2025 or 2050 and that's it.


----------



## ssgwright

SoldierRBT said:


> I’m still using 1.056v locked for Port Royal to avoid hitting PL (450W). I can use 1.062v too but for some reason the card downclocks from 2265MHz straight to 2190-2175MHz at that voltage giving me lower scores in PR. I double checked the wattage and voltage and they’re fine at 1.062v it’s just the core doesn’t like that voltage bin idk.
> 
> 
> 
> Yeah the card is on stock cooler with the 450W beta BIOS. I already did a video a month ago running Port Royal at 2220-2205MHz avg (2250MHz max) 21C ambient. An extra +15MHz for 7C less shouldn't be that hard to believe.


what card and bios are you running again?


----------



## SoldierRBT

EVGA RTX 3080 FTW3 Ultra with 450W BIOS


----------



## ssgwright

ah damn... wish i picked up a 3 pin card


----------



## BluePaint

@ssgwright
Well, from everything I have seen, his chip is simply in its own league, achieving those frequencies at those temps.
Polarfrog needed about 25 degrees lower average temps than SoldierRBT to achieve a minimally higher average frequency in Port Royal, both with an Intel CPU @ 5.4GHz:
https://www.3dmark.com/compare/pr/547536/pr/579484#


----------



## knightriot

Hi guys, I just got an Aorus Waterforce 3080, but it's limited to 370W. What BIOS can I use for more power?


----------



## Sys0p

Here's my 3dMark numbers on a water cooled Asus TUF 3080 OC. 

Port Royal - 12,605 https://www.3dmark.com/pr/579916

Time Spy - 17,256 https://www.3dmark.com/spy/15833595

What I found interesting is how much different workloads impact the overclock. In the ray-trace-heavy Port Royal my OC is configured for 2115MHz @ 950mV and +1100 mem, but this is not stable in Time Spy, where I had to up the voltage to 975mV to be stable at 2100MHz and drop mem to +1000.

Both of these run at 100% TDP in GPU-Z despite what I set the power limit to in Afterburner.

Similarly, when running Blender workloads, no matter what voltage/clock speeds I configure in Afterburner, the GPU won't go above about 80% TDP (although even with the clock increased to 2191MHz, render time doesn't decrease by a significant amount).

For daily OC and gaming I'm running 1920MHz at 875mV and +500 mem, which runs Watch Dogs Legion at 4K 60fps with ray tracing on and graphics settings maxed out, and the GPU never goes above 40C.

Would love to see Asus release a BIOS for this board that allows it to hit the actual 110% TDP limit configured.


----------



## mardon

reflex75 said:


> Yes, you are missing something: you are targeting the wrong card.
> Coming from a 2080ti your path is 3090, not 3080.
> And Ampere can bring you at least 30% better performance vs Turing:
> https://www.3dmark.com/compare/spy/14835421/spy/15738613


This is useful, thanks. I actually purchased a Zotac 3090 but bottled it and cancelled my order. I'm not paying for a load of VRAM I'll never use. I'll either wait for the 3080 Ti or next gen.


----------



## xermalk

zhrooms said:


> No, because it's a temporary issue, they just need to release an updated BIOS. No idea when that might happen but it's fairly safe to say they'll get to it eventually.


I've actually tried some of the other 370W BIOSes on my TUF, and the power usage is still just about identical. I'm kinda wondering if it's only reporting the power from the PCIe plugs; if you add 21W to what it's reporting you get awfully close to the 375W or 370W total.
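xermalk's hunch is simple arithmetic. A tiny sketch, assuming (as the post does, without proof) that the monitoring tools report only the 8-pin rails while the PCIe slot contributes roughly 21W on top:

```python
SLOT_DRAW_W = 21.0  # assumed unreported PCIe-slot draw, per the post (not measured)

def estimated_board_power(reported_w: float) -> float:
    """Total board power if monitoring misses the slot's contribution."""
    return reported_w + SLOT_DRAW_W

# Readings of ~349 W and ~354 W would then line up with the 370 W / 375 W caps.
print(estimated_board_power(349.0), estimated_board_power(354.0))
```

If the hypothesis were right, the different BIOS caps would all land about 21W above whatever the software shows, which matches what the post describes.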


----------



## lordzed83

Water block no. 2. I present you the 3080 FE Alphacool block. Got one of the first ones made!!!!!


----------



## sblantipodi

Is there someone here who succeeded in buying a 3080 FE?


----------



## StreaMRoLLeR

OK, my score is very low given my values: 2190-2160 (more like a 2175 hold), fans 100, voltage 100, and ambient near 21C. Do I need more CPU speed and more RAM speed? 9900K at 5.1GHz and CL17 4000MHz memory.


----------



## ausmisc

parcher said:


> *55°*?? Are you on liquid?? :uncertain:


Nah just with fans at 100%.


----------



## lordzed83

sblantipodi said:


> is there someone here who succeded buying a 3080 FE here?


Besides me? I've had 3x 3080s so far.


----------



## ViTosS

Alemancio said:


> Have you tried undervolting? Im using a FTW3 at 2100MHz @ 1025mV and it barely peaks at 380W


Which software did you use to undervolt? If it was MSI AB 4.6.3 Beta 2, is this the way to undervolt?










In my case I want always fixed 0.950v and 1980Mhz boost.


----------



## rioja

Sys0p said:


> water cooled Asus TUF 3080 OC


which waterblock do you have?


----------



## Erik9519

knightriot said:


> Hi guy ,I just got an aorus waterforce 3080, but it limited at 370W. What bios i can use for more power?


I don't think there's a better BIOS, given the default is 370W. I believe there is a BIOS for a 2x 8-pin card that allows up to 375W, but I don't think it's worth crossflashing for 5 watts.

I myself recently returned my Inno3D iChill Frostbite to swap it for an Aorus Waterforce WB. What's the maximum power usage you see, and your gaming-stable overclock?
Mine doesn't cross 350W and can overclock to +110 core, +1200 memory. It's stable in COD Cold War, Mass Effect Andromeda and Star Wars Squadrons.


----------



## Sys0p

rioja said:


> which waterblock do you have?


EK


----------



## eliwankenobi

Can somebody point me to a guide on how to undervolt the 3080? Try to find that sweetspot?

I'd also like to learn how to set a fixed voltage so as to prevent the card from "over speeding", I guess. Like, I'm running a demanding task like Port Royal, and when it ends the card shoots from 1950 to 2070MHz and back. I believe this was what was causing the crashes before, and why they neutered our power limits.


----------



## Erik9519

I've been doing some fairly extensive overclocking on my 3080 Xtreme Waterforce WB, and these have been my results so far. The base boost for my card is 1845MHz, and I've managed a +160 offset capped at 1050mV and +1260 on the memory. The +160 cap at 1050mV is because anything over 2175MHz would crash in the Fire Strike Combined test, whilst memory over +1260MHz would produce split-second scrolling green bars in Fire Strike Graphics Test 1.
It seems like Fire Strike is the quickest way to find out whether an overclock is stable or not, and Port Royal will pass pretty much anything.



Spoiler: Benchmark results (Graphics score only listed)



Fire Strike 44193
Fire Strike Extreme 22472
Fire Strike Ultra 11820
Port Royal 12505
Time Spy 19412
Time Spy Extreme 9740
Unigine Superposition (4K Optimized) 16025



EDIT: To be COD Cold War stable I had to further reduce peak frequency to 2160MHz.


----------



## eliwankenobi

Erik9519 said:


> I've been doing some fairly extensive overclocking on my 3080 xtreme waterforce wb and these so far have been my results. Boost base for my card is 1845MHz and I've managed +160 offset capped at 1050mV and +1260 on the memory. The +160 cap at 1050mV is because anything over 2175MHz would crash in Fire Strike Combined test whilst the memory over +1260MHz would make for a split second scrolling green bars in Fire Strike Graphic Test 1.
> It seems like Fire Strike is the quickest way to find out if an overclock is stable or not and that Port Royal will pass pretty much everything.
> 
> 
> 
> Spoiler: Benchmark results (Graphics score only listed)
> 
> 
> 
> Fire Strike 44193
> Fire Strike Extreme 22472
> Fire Strike Ultra 11820
> Port Royal 12505
> Time Spy 19412
> Time Spy Extreme 9740
> Unigine Superposition (4K Optimized) 16025


Good suggestion! Will try with that instead of Port Royal


----------



## StreaMRoLLeR

Erik9519 said:


> I've been doing some fairly extensive overclocking on my 3080 xtreme waterforce wb and these so far have been my results. Boost base for my card is 1845MHz and I've managed +160 offset capped at 1050mV and +1260 on the memory. The +160 cap at 1050mV is because anything over 2175MHz would crash in Fire Strike Combined test whilst the memory over +1260MHz would make for a split second scrolling green bars in Fire Strike Graphic Test 1.
> It seems like Fire Strike is the quickest way to find out if an overclock is stable or not and that Port Royal will pass pretty much everything.
> 
> 
> 
> Spoiler: Benchmark results (Graphics score only listed)
> 
> 
> 
> Fire Strike 44193
> Fire Strike Extreme 22472
> Fire Strike Ultra 11820
> Port Royal 12505
> Time Spy 19412
> Time Spy Extreme 9740
> Unigine Superposition (4K Optimized) 16025
> 
> 
> 
> EDIT: To be COD Cold War stable I had to further reduce peak frequency to 2160MHz.


What would happen if you let the card eat 1.100V, with the voltage maxed out? 2175 is near the limit for our cards. Above 2175 (2190, 2205, 2220) are the best bins anyway. So yours can push a little further, I think.


----------



## rioja

Del


----------



## StreaMRoLLeR

eliwankenobi said:


> Can somebody point me to a guide on how to undervolt the 3080? Try to find that sweetspot?
> 
> I’d like also to learn how to set a fixed voltage so as to prevent the card from “over speeding” I guess. Like I’m running a demanding task like Port Royal and then when it ends it shoots from 1950 to 2070mhz and then back. I believe this was what was causing the crashes before and why they neutered our power limits



First open AB, then enter -280 on the core offset.

Start with 0.900V at 1950, up to 1995MHz.

You just locate the 0.900V point from the bottom, then raise that box accordingly. I would say 0.900V is the sweet spot.
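The steps above amount to shifting the voltage/frequency curve and then flattening it past the chosen point, so the card never requests more than 0.900V. A rough sketch of that idea, with made-up curve values (Afterburner does this through its curve editor, not a script):

```python
# Illustrative V/F points (mV -> MHz); not real Afterburner data.
stock_curve = {850: 1800, 900: 1905, 950: 1995, 1000: 2070, 1050: 2130}

def lock_undervolt(curve: dict, offset_mhz: int, lock_mv: int) -> dict:
    """Apply a global frequency offset, then cap every point above lock_mv
    to the frequency at lock_mv, so boost stops asking for more voltage."""
    shifted = {mv: mhz + offset_mhz for mv, mhz in curve.items()}
    lock_freq = shifted[lock_mv]
    return {mv: min(mhz, lock_freq) if mv > lock_mv else mhz
            for mv, mhz in shifted.items()}

locked = lock_undervolt(stock_curve, offset_mhz=45, lock_mv=900)
print(locked[900], locked[1050])  # both 1950 MHz: the card tops out at 0.900 V
```

The post's method (a large negative offset, then dragging the 0.900V box up) ends in the same flat-topped curve; the positive-offset version here is just easier to express in code.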


----------



## Tyler Dalton

SoldierRBT said:


> Lower ambient temp. Still on air.
> 
> 
> 
> https://www.3dmark.com/pr/579484
> 
> 
> Graphics Score 133nice
> 
> 
> 
> https://www.3dmark.com/dxr/27224


Finally tracked you down, so you're the one I have no hope of catching for fastest in the US lol. I did manage to get to 13201 in Port Royal; ignore the avg clock, because for some reason it's wrong, I had lower scores that had a much higher avg clock. https://www.3dmark.com/pr/576202


----------



## StreaMRoLLeR

Tyler Dalton said:


> Finally tracked you down, so you're the one I have no hope of catching for fastest in the US lol. I did manage to get to 13201 in Port Royal; ignore the avg clock, because for some reason it's wrong, I had lower scores that had a much higher avg clock. https://www.3dmark.com/pr/576202


Wow another very good score.

Is yours undervolted too, or on water? Did you make msconfig tweaks?


----------



## Tyler Dalton

Air with the window open; room temp was down to about 60F. Not really undervolted, but I did mess with the curve. No msconfig tweaks. I said I was done trying to push it higher, but I decided to try one more thing and give my curve a boost at the 1.062V mark, and after failing the first time I got it to pass the second time with a new high score of 13246. Without getting the room A LOT colder I don't think I can get it any higher though. https://www.3dmark.com/3dm/54123418


----------



## AngEv1L

Who can help me with a good BIOS for my 3080 SG?
The power limit is 320W, but I need maybe 350 or more for use with a waterblock.


----------



## Krisztias

sblantipodi said:


> is there someone here who succeded buying a 3080 FE here?


Yes, from Germany. I have an Alphacool Waterblock on the way too.


----------



## DStealth

AngEv1L said:


> Who can help me with a good BIOS for my 3080 SG?
> The power limit is 320W, but I need maybe 350 or more for use with a waterblock.


No one. Shunt mod is the only way.


----------



## AngEv1L

DStealth said:


> No one. Shunt mod is the only way.


Really? OK. I flashed the Palit GamingPro OC BIOS; the card can now boost past 340W, but not to 370 with 109% PL. This is better than stock and OK for the stock cooler, but with a WB I think I need the shunt. Do you have an easier guide for the shunt mod? Videos or instructions?
Thanks


----------



## mattxx88

DStealth said:


> No one. Shunt mod is the only way.


Still, it might not be enough.

I got my 3080 TUF shunted perfectly (looking at GPU-Z values)


















but it still gets PL cut; dunno what to think at this point


----------



## Stash

mattxx88 said:


> still it might be not enough
> 
> i got mine 3080 TUF shunted perfectly (looking at gpuz values)
> 
> View attachment 2467682
> View attachment 2467684
> 
> 
> 
> but it still get PL cut, dunno what to think at this point


GPU chip limit is 180W iirc, so that is likely your cap.


----------



## mattxx88

Stash said:


> GPU chip limit is 180W iirc, so that is likely your cap.


The left screen is clearly without the mod; the right one is shunt-modded, and after the shunt it reports a chip power draw of 140W.


----------



## Stash

mattxx88 said:


> The left screen is clearly without the mod; the right one is shunt-modded, and after the shunt it reports a chip power draw of 140W.


I misread. 😅


----------



## mattxx88

Stash said:


> I misread. 😅


np

Anyway, as I said, I think there is something else regulating the power.


----------



## StreaMRoLLeR

OK. Some fine tweaking and a cold boot, ambient 11C. I think I entered the top league on air <3



















----------



## SoldierRBT

Tyler Dalton said:


> Air with the window open, room temp was down to about 60F. Not really undervolted but I did mess with the curve. No msconfig tweaks. I said I was done trying to push it higher but I decided to try one more thing and give my curve a boost in the 1.062v mark and after failing the first time, I got it to pass the second time with a new high score of 13246. Without getting the room A LOT colder I don't think I can get it any higher though. https://www.3dmark.com/3dm/54123418


Nice result! Your card on water should be able to pass my score. I'm still not sure whether I'll watercool it or wait for the RTX 3080 Ti.


----------



## Tyler Dalton

SoldierRBT said:


> Nice result! Your card in water should be able to pass my score. I’m still not sure if I watercool it or wait for the RTX 3080 Ti


Maybe; my problem is I run into a power wall. Interesting story: this is my original 3080, but EVGA sent me a replacement because I had a defect with my light bar. The replacement could go higher on the core than my current one, but anything above +800 on memory and it fell on its face. It also drew 20W or so less power at the same clocks. I ended up asking if I could just swap the shrouds and they said yes. The memory on the original card is insane; it can do +1600. If I could have the core of that replacement card with the memory of my original card, it would be insane.


----------



## rankftw

Tyler Dalton said:


> Maybe, my problem is I run into a power wall. Interesting story, this is my original 3080, but EVGA sent me a replacement cause I had a defect with my lightbar. The replacement could go higher on the core than my current one, but anything above +800 on memory and it fell on it's face. It also drew 20w or so less power at the same clocks. I ended up asking if I could just swap the shrouds and they said yes. The memory on the original card is insane, can do +1600. If I could have the core of that replacement card with the memory of my original card, it would be insane.


Ask them if you could swap the memory modules too


----------



## ssgwright

My Asus TUF is the same; the core is insane, I can game at 2100-2150 no issue, but if I overclock the RAM past 850 I start getting worse performance.


----------



## Tyler Dalton

rankftw said:


> Ask them if you could swap the memory modules too


If only it was that easy lol. The card with the good core had some other issues though: under heavy load during OC'ing, the 3rd fan would randomly decide to drop from the 100% I had set all the way down to auto, then it would ramp itself back up to about 85% and get stuck there.


----------



## StreaMRoLLeR

Tyler Dalton said:


> Maybe, my problem is I run into a power wall. Interesting story, this is my original 3080, but EVGA sent me a replacement cause I had a defect with my lightbar. The replacement could go higher on the core than my current one, but anything above +800 on memory and it fell on it's face. It also drew 20w or so less power at the same clocks. I ended up asking if I could just swap the shrouds and they said yes. The memory on the original card is insane, can do +1600. If I could have the core of that replacement card with the memory of my original card, it would be insane.


Have a nice little happy trick for you.

Turn down all the RGB on the FTW3.

Run the fans at 80%. At 100% the fans draw about 40W and gimp your power pool.


----------



## blackzaru

mattxx88 said:


> still it might be not enough
> 
> i got mine 3080 TUF shunted perfectly (looking at gpuz values)
> 
> View attachment 2467682
> View attachment 2467684
> 
> 
> 
> but it still get PL cut, dunno what to think at this point


Which resistors did you shunt? The 2 main ones, or the 5 on the 8-pin side? All 6 (the 5 on the 8-pin side and the PCIe one)?

And what resistors did you stack on them?


----------



## mattxx88

blackzaru said:


> Which resistors did you shunt? the 2 main, the 5 on the 8 pins side? All 6 (the 5 on the 8 pins side and the PCIe)?
> 
> And what resistors did you stack on them?


All 6 resistors, with 3mOhm, soldered on.


----------



## blackzaru

mattxx88 said:


> all 6 resistors with 3mOhm, welding


That might explain it. I'm looking at 5 or 8 mOhm personally. I do think your culprit is the PCIe slot power. By putting a 3mOhm resistor on the 5mOhm already there (in parallel), you effectively took the max theoretical power draw of that slot from 75 to 200W. My guess is: given that 3000-series GPUs are notorious for trying to "balance" power in a certain ratio between the power connectors and the PCIe slot, if your PCIe slot cannot provide the power required (it will never get anywhere near 200W), the card will also limit the power drawn from the 8-pin connectors, as it tries to keep the same "ratio" of power drawn.

In clearer words: if the slot can only provide 100W, the GPU will see that slot as being limited to 50% of its max power, and will thus apply the same relative limit to the two 8-pin power connectors to keep its "ratio".

This is just a theory I have, but if so, changing your shunt on the PCIe slot to a 5mOhm or 8mOhm might solve your problem. (I might be wrong though, as it's only a possibility I'm thinking about.)
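The 75W-to-200W figure follows from basic parallel-resistor math: the controller still assumes a 5mOhm shunt, so it under-reads by the ratio of old to effective resistance. A small sketch of that calculation, using the resistor values quoted above:

```python
# Parallel-resistor math behind the shunt mod described above.
# Values are the ones quoted in the post: a 5 mOhm stock shunt on the
# PCIe slot rail with a 3 mOhm resistor stacked on top (in parallel).

def parallel(r1: float, r2: float) -> float:
    """Effective resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

stock_mohm = 5.0       # original shunt
stacked_mohm = 3.0     # resistor soldered on top

r_eff = parallel(stock_mohm, stacked_mohm)   # 1.875 mOhm
scale = stock_mohm / r_eff                   # controller under-reads by this factor
apparent_limit_w = 75.0                      # PCIe-slot budget the controller enforces
real_limit_w = apparent_limit_w * scale      # actual draw when the reading hits 75 W

print(f"{r_eff:.3f} mOhm, x{scale:.2f}, {real_limit_w:.0f} W")
```

At a fixed rail voltage the sensed shunt voltage drops by the same factor the resistance does, which is why the enforced 75W budget behaves like roughly 200W after the mod.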


----------



## mattxx88

blackzaru said:


> That might explain it. I'm looking at 5 or 8 mOhm personnally. I do think that your culprit is the PCIe slot power. By putting a 3mOhm resistor on the 5mOhm already there (in parallel), you effectively took the max theorethical power draw of tht slot from 75 to 200W. My guess is: given 3000 series gpus are notorious for trying to "balance" the power in a certain ratio between the power connectors and the pcie slot, if you pcie slot cannot provide the power required (it will never get anywhere near 200W), it will also limit the power drawn from the 8 pins connectors, as the card is trying to keep the same "ratio" of power drawn.
> 
> In clearer words, if the slot can only provide 100W, the gpu will see that slot being limited at 50% of it's max power, and will thus apply the same relative limits to the 2 8 pins power connectors you have to keep its "Ratio".
> 
> This is just a theory I have, but, if so. Changing your shunt on the pcie slot to a 5mOhm or 8mOhm might solve your problem. (I might be wrong though, as it's only a possibility I'm thinking about.)


You might be right; I'll try shunting the PCIe with 8mOhm and report back.


----------



## blackzaru

mattxx88 said:


> you might be right, i'll try shunting pcie with 8mohm and report back


I'd gladly like to know the result, as I have a TUF myself and am planning to shunt mod it within the week (probably on Sunday).


----------



## smoke2

I would like to ask the owners of the Gigabyte Gaming OC model:
how many RPM do your fans run at during gaming, with the switch set to Silent mode and stock frequencies?


----------



## Falkentyne

mattxx88 said:


> you might be right, i'll try shunting pcie with 8mohm and report back


I don't believe your problem is the slot.
Can you please run Time Spy and Port Royal with HWiNFO64 open, with the TDP Normalized and TDP% values (maximum) both showing, and also GPU-Z max values showing? That will require 2 screenshots: 1 for Time Spy, 1 for Port Royal. I think I know the problem, but I need to check.


----------



## mattxx88

Falkentyne said:


> I don't believe your problem is the slot.
> Can you please run a Timespy and Port Royal with hwinfo64 open with the TDP Normalized and TDP% values (maximum) both showing, and also gpu-z max values showing? That will require 2 screenshots, 1 for tmespy, 1 for Port royal. I think I know the problem but I need to check.


No problem, I'll do it.
Can you share what you're thinking ahead of time? I'm curious.


----------



## outofmyheadyo

3080 master vs FTW3 what would you go for ?


----------



## Falkentyne

mattxx88 said:


> no problem, i'll do it
> can you anticipate your thought? i'm curious


I don't know. I need the complete screenshots with you drawing max power load and the normalized vs TDP% balance.
I suspect something is up with the chip power shunt.


----------



## phillyman36

Got home a little while ago and found EVGA sent me a link to purchase the RTX 3080 FTW3. I really wanted an Asus Strix 3080 but decided to get the FTW3. Should have it Friday 12/11.


----------



## SoldierRBT

outofmyheadyo said:


> 3080 master vs FTW3 what would you go for ?


It depends. If you'd like to keep the stock cooler, I'd say go with the 3080 master extreme (3x 8pins). Stock cooler looks better IMO and you get a cool mini screen. If you're planning watercooling, FTW3 and get any compatible block you can find.


----------



## Stash

arrow0309 said:


> Keep us informed pal, I'm playing @2100 min on both WDL and ACV for the moment with the Trio's stock BIOS, so there's no need to increase any power draw, but I'll surely give it a run as well soon; maybe I'll even put her under water, and then it's a must.


Bit late but this is the most I managed to get on air: https://www.3dmark.com/pr/587035

If you could control the thermals better (i.e. water) then I could see it hitting avg. 2.2GHz comfortably... so, if you're already on avg. 2.1 I'm not sure you'd see much/any benefit on air. Game changer for water, though. Considering a block for 2021.


----------



## c0nsistent

I have to undervolt the crap out of this 320W XC3 to get any kind of score without power throttling. 0.875V is right where it starts downclocking in GT2.


I've got 5 and 8 mOhm shunts along with a hot glue gun and multimeter ready to go this weekend. Even though I'm on air, this power limit is atrocious. I ran my 2080 Ti @ 400W+ on 2x 8-pins, and I'll do the same with this. If I have to, I can just limit the TDP manually via Afterburner if it gets too out of control. I think 400W would wake this card up well.


----------



## saar

SoldierRBT said:


> It depends. If you'd like to keep the stock cooler, I'd say go with the 3080 master extreme (3x 8pins). Stock cooler looks better IMO and you get a cool mini screen. If you're planning watercooling, FTW3 and get any compatible block you can find.


3080 Master: 2x 8-pins \ Xtreme: 3x 8-pins


----------



## saar

Hi, I need help. I got a 3080 AORUS Xtreme (3x 8-pins) and out of the box it performs really well, but I can't really OC or undervolt it for a better score/higher frequency than out of the box. :/
If I flash the Strix OC BIOS or a 3090 BIOS, might that help?
Sorry about my English.


----------



## VPII

saar said:


> Hi, I need help. I got a 3080 AORUS Xtreme (3x 8-pins) and out of the box it performs really well, but I can't really OC or undervolt it for a better score/higher frequency than out of the box. :/
> If I flash the Strix OC BIOS or a 3090 BIOS, might that help?
> Sorry about my English.


You cannot flash it with the RTX 3090 BIOS, and in all honesty, this is what is called the lottery. I had a Palit RTX 3080 GamingPro OC which was great; it clocked to 2145-2160MHz core in any game, but I sold it as the power limit was a pain. Then I got a Gigabyte RTX 3080 Eagle OC, which was great for cooling but could barely hold 2070MHz core for benching or gaming, after which I let it go and got an MSI RTX 3080 Gaming X Trio. Once again, about the same clocks as my previous Palit card but with added power limit, and it is running great.

So what I am saying is: no matter which RTX 3080 you buy, you will have a card that can clock either a little higher or a lot higher, depending on GPU quality.


----------



## Belcebuu

blackzaru said:


> I'd gladly like to know the result, as I have a Tuf myself, and am setting to shunt mod it within the week. (probably on Sunday)


Why are you guys shunting a card that is limited by the 2x8 pins?


----------



## bmgjet

Belcebuu said:


> Why are you guys shunting a card that is limited by the 2x8 pins?


So that it's not limited by the 2x 8-pin, lol.


----------



## blackzaru

bmgjet said:


> So that its not limited by the 2X8pin lol.


Couldn't have said it better. hahahahaha


----------



## ZealotKi11er

Still waiting for my 3080 TUF. Was 4th in line 3 weeks ago.


----------



## marashz

Hello all! I'm waiting for an EVGA 3080 XC3 Ultra to be delivered on Monday or Tuesday. I was about to order any 3080 card I could get, and I managed to order this one. After that, I read that it's power limited. I don't care too much about stock cooling; it will be watercooled anyway. I have a Ryzen 3900X, cooled by a MO-RA3 420 (9x Arctic P14).
Ok, so a question: what difference are we talking about in a very good OC scenario between, let's say, the Strix and the XC3? I play at 5120x1440 (basically 4K), so will it be like 1-3fps, or more like 4-10fps?
I plan to upgrade again to next-gen hardware (AM5, DDR5, RDNA3 / Hopper), so this setup will last until 2022 or so.

TL;DR: What's the difference in perf % between a 95th-percentile Strix and an XC3? Max OC on water without mods.


----------



## Anth0789

Took me 1 month and 10 days to get my pre-order for the RTX 3080 AORUS at my local CC.


----------



## Masayama

saar said:


> Hi, I need help. I got a 3080 AORUS Xtreme (3x 8-pins) and out of the box it performs really well, but I can't really OC or undervolt it for a better score/higher frequency than out of the box. :/
> If I flash the Strix OC BIOS or a 3090 BIOS, might that help?
> Sorry about my English.


The AORUS Xtreme has a power limit of 450W, so there is no need to flash another BIOS.


----------



## lordzed83

There ya go


----------



## lordzed83




----------



## mattxx88

Falkentyne said:


> I don't believe your problem is the slot.
> Can you please run a Timespy and Port Royal with hwinfo64 open with the TDP Normalized and TDP% values (maximum) both showing, and also gpu-z max values showing? That will require 2 screenshots, 1 for tmespy, 1 for Port royal. I think I know the problem but I need to check.


Hi Falken, sorry for the delay, but I've been busy with work and my new 3090.
Meanwhile I've put a waterblock on the TUF 3080, and these are the tests you asked for.

In these tests I overclocked with a 1.000V @ 2130MHz curve.
Now I'll try raising the voltage and frequency, and we'll see if it still caps.

EDIT: below are further tests with a 1.050V @ 2200MHz curve.

I don't know why, even with the full shunt mod, it still drops in certain conditions. This makes me think there is a deeper power control in these cards that goes beyond just the shunt resistors.


----------



## Falkentyne

mattxx88 said:


> hi Falken, sorry for the delay but ive been busy with work and my new 3090
> meanwhile i have put a waterblock on TUF 3080 and this is the tests you asked for
> 
> 
> 
> 
> in this tests i overclock with curve 1,000V 2130mhz
> now i try rising voltage and frequencies and we see if still cap
> 
> 
> EDIT: below further tests with oc 1.050V 2200mhz curve
> 
> 
> 
> 
> dunno why even with full shunt, it still drop in certain conditions. this make me think at a deeper power control in these cards that goes above the only resistors


Your GPU Chip power shunt is not modded or not modded correctly.


----------



## Garrett1974NL

Not sure why my Inno3D X2 3080 has an "extra" connector... which isn't even used lol... only the 2 PWM fan connectors are used.
It's not for any kind of RGB lighting because the card simply doesn't have that, only the INNO3D GEFORCE RTX lights up which gets its power from 1 of the PWM headers.
(along with 1 of the 2 fans)
Anyone got an idea?
I could just remove the connector and bend the pins till they break off; if I don't, no waterblock will fit.
I don't care about warranty, so if it breaks, screw it lol


----------



## odin24seven

My new build with an EVGA FTW Ultra Gaming, clocked at 2070MHz GPU and 19202MHz memory.


----------



## mattxx88

mattxx88 said:


> hi Falken, sorry for the delay but ive been busy with work and my new 3090
> meanwhile i have put a waterblock on TUF 3080 and this is the tests you asked for
> 
> 
> 
> 
> in this tests i overclock with curve 1,000V 2130mhz
> now i try rising voltage and frequencies and we see if still cap
> 
> 
> EDIT: below further tests with oc 1.050V 2200mhz curve
> 
> 
> 
> 
> dunno why even with full shunt, it still drop in certain conditions. this make me think at a deeper power control in these cards that goes above the only resistors


I have to edit the above tests because Falkentyne was right.
I took the waterblock off and re-melted all the resistors using flux; this gives a better solder spread, and now the GPU-Z values look fine.

I'll go on with the tests.


----------



## blackzaru

mattxx88 said:


> got to edit above tests cause Falkentyne was right
> i get waterblock down and re-melted all rsistors using flux, this gives a better welding spreading and now gpuz values seems fine
> 
> 
> 
> i go on with tests


Glad to see your problem was fixed. Pretty sure you can aim for 2200 on the core now. Don't you think?


----------



## mattxx88

blackzaru said:


> Glad to see your problem was fixed. Pretty sure you can aim for 2200 on the core now. Don't you think?


Yeah, I'm running some benches now.

Finding the lowest voltage first: completed Superposition 1080p Extreme at 0.9V / 2115MHz with +1000 memory.

Now I'll try Fire Strike and Port Royal again at 2200MHz / 1.050V.


----------



## blackzaru

mattxx88 said:


> yea im making some benches now
> 
> finding the lower volt first, cloes superpo 1080p extreme 0.9v 2115mhz +1000 memory
> 
> 
> 
> now i try again firstrike an port royal 2200mhz 1,050v


You might want to OC that CPU of yours, mate. Gotta make sure it helps that shunted GPU push as many frames as it can.


----------



## mattxx88

blackzaru said:


> You might want to OC that CPU of yours mate. Gotta make sure it helps that shunted gpu push as much frames as it can.


Well, Superposition doesn't report the correct frequency; it's OC'ed to 5.3GHz 😅

Further test @ 2230MHz / 1.050V


----------



## blackzaru

mattxx88 said:


> well, superpo dont report the correct freq, it`s oc`ed 5.3ghz 😅
> 
> further test @2230mhz 1.050v


Alright, I didn't know Superposition failed to show the clock properly, despite labeling it as "actual" (I don't use Superposition). I was really dumbfounded at the possibility that someone would be willing to shunt a GPU but not OC a CPU. hahaha

And, nice clocks (2230). How high can your memory go? Mine seems to tap out at +700-750 (on air; can't wait to get a waterblock to lower temps and maybe improve stability). In Port Royal, I noticed that increments of memory overclock were scaling the score a lot better than core overclocking, as if memory was bottlenecking some parts of the test.

EDIT: and, btw, you are a single point away from having the "best" score on Port Royal, do get that point mate, just do it!!!!


----------



## mattxx88

blackzaru said:


> Alright, didn't know Superposition failed to show the clock properly, despite labeling it as "actual" (I don't use superposition). I was really dumbfunded at the possibility that someone would be willing to shunt a gpu, but not OC a cpu. hahaha
> 
> And, nice clocks (2230). How high can your memory go? Mine seems to tap out (on air, can't wait to get a waterblock to lower temps and maybe improve stability) at +700-750. In Port Royal, I noticed that increments of mem overclocks were scaling with the score a lot better than core overclocking, as if memory was bottlenecking some parts of the test.
> 
> EDIT: and, btw, you are a single point away from having the "best" score on port royal, do get that point mate, just do it!!!!


Atm I'm testing memory at +1000; didn't try more.

Where do you see the Port Royal scores?


----------



## blackzaru

mattxx88 said:


> atm im testin memory at +1000 didnt try more
> 
> where do you see the port royal scores?


It's literally on your test results. The "BEST" score (under your score) is the current top position, in points, for the same CPU+GPU configuration as you have. Alternatively, you can also go here: https://www.3dmark.com/search and sort them manually by either cpu and/or gpu.


----------



## mattxx88

blackzaru said:


> It's literally on your test results. The "BEST" score (under your score) is the current top position, in points, for the same CPU+GPU configuration as you have. Alternatively, you can also go here: https://www.3dmark.com/search and sort them manually by either cpu and/or gpu.


After dinner I'll go after that Russian's score.


----------



## Koby990

Has anyone tried a BIOS mod on a Founders Edition? Can you share which BIOS and the results?


----------



## mattxx88

Koby990 said:


> Did anyone tried bios mod a founder edition? Can he share to what bios and results?


You can't flash another BIOS on the FE; just shunt mod it.


----------



## marashz

What about the XC3? Is there any way to bypass the 320W limit (I know in the BIOS it's 366W, but many say their cards don't go over 320W or so)? Is there a way to do the shunt mod but, in case of RMA, remove it without any visible signs of modding?


----------



## Koby990

mattxx88 said:


> You cant flash other BIOS on FE, Just shunt


Thank you very much.
Is that because of the 12-pin connector?


----------



## mattxx88

marashz said:


> What about XC3? Is there any way to bypass 320W limit (I know in bios it's 366W, bet many says their cards doesn't go over 320W or so)? Is there way to do shunt mod, but in case of RMA remove it without any visible modding steps?


I tried desoldering one resistor and using flux + copper desoldering braid (solder removal). It appears to be brand new after the cleaning (last pass with isopropyl alcohol).
I don't have any soldering skills, but flux is the key; it funnels your solder exactly where you touch with the iron.


----------



## mattxx88

Koby990 said:


> Thanks you very much.
> Is that because of the 12 pin connector?


You got It 😔


----------



## StreaMRoLLeR

My golden bin just passed Port Royal at 0.975V, 2115-2100MHz.
These values are at 366W draw at 2100MHz; temps stay between 63-66°C.

Given that a "normal" 3080 could have trouble reaching 2100MHz even with a 400W+ BIOS, I am happy.


----------



## mattxx88

Streamroller said:


> My golden bin. just passed .975mV 2115-2100 Port Royal.
> This values are at 366W draw at 2100mhz. Stay between 63-66
> 
> Giving a "normal" 3080 could have trouble reaching 2100mhz even with 400W+ bios . I am happy
> View attachment 2468168


This is my low-volt score: 0.925V @ 2115MHz.

I think almost all 3080s can run this freq/volt.

EDIT:

Finally got my 13k, still counting.


----------



## blackzaru

mattxx88 said:


> this i my low volt score 0.925v 2115mhz
> 
> 
> 
> i think almost all 3080 can run this freq/volt
> 
> edit
> 
> got mine 13k finally, still counting


Your gpu temps: current: 15, min: 13, max: 27... Let's just say that I doubt that the guy above you, running on air at 63-66, has the same temperature stability as you have, watercooled, and either running a chilled bath, or in a room at a temperature of 10 degrees.

Let's compare apples to apples. You can't expect almost all 3080s to be stable at 2115MHz 0.925V, not with normal cooling.


----------



## SoldierRBT

Here's my undervolt result. Score: 12916. 0.950v locked at 2160MHz. Average clocks 2145-2130MHz. Max temp: 53C Max Board Power Draw: 367W


https://www.3dmark.com/pr/606711


----------



## Belcebuu

blackzaru said:


> Couldn't have said it better. hahahahaha


But how is doing the shunt like adding an extra 8-pin? Is the gain really that big? Can you get a 450W BIOS with it?


----------



## blackzaru

Belcebuu said:


> But how doing the shunt is adding a 1x8 pin?


It's not; shunting lets the card pull more power per 8-pin connector, going beyond the 150W "limit" per connector that is normally in place.


----------



## bmgjet

Belcebuu said:


> But how doing the shunt is adding a 1x8 pin? is that much the gain? can you get a 450w bios with it?


Companies don't like to go over the ATX spec, since they can be held accountable if someone using a super **** PSU burnt their house down.
The official spec is 66W on the slot's 12V rail and 150W per 8-pin.
Which is why they aren't willing to release a BIOS much higher than 366W for two-plug cards.

With shunt modding you are tricking the card into thinking it's under those limits. If you stack a 5 mOhm resistor on the stock shunt, it will halve the reading.
So at 150W real draw, the card thinks it's only pulling 75W in software readings, so it's happy to keep pulling more.

The ATX spec is way under-specced, since it was aimed more as a minimum requirement for very, very cheap OEMs who build to min spec to save money wherever they can.
For example, EPS uses the same plug and gauge of wire but is rated to 225W. So there's no reason a decent PSU couldn't do that over the 8-pin plug. That would get you to 516W.
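The arithmetic above can be sketched quickly. This is purely illustrative: the 5 mOhm stock shunt value and the per-connector ratings are the ones quoted in the post, not measurements from any specific card.

```python
# Sketch of the power-budget and shunt-mod arithmetic described above.
# Assumes 5 mOhm stock shunts and the quoted ATX/EPS connector ratings.

def board_budget(slot_w, per_plug_w, plugs):
    """Total power budget: PCIe slot 12V rail plus the 8-pin connectors."""
    return slot_w + per_plug_w * plugs

def parallel(r1, r2):
    """Effective resistance of a resistor stacked on top of the stock shunt."""
    return (r1 * r2) / (r1 + r2)

def reported_power(real_w, r_stock, r_effective):
    """The card senses current via V = I*R across the shunt, so lowering R
    makes it under-report draw by the ratio r_effective / r_stock."""
    return real_w * (r_effective / r_stock)

atx_budget = board_budget(66, 150, 2)   # official spec: 66W slot + 2x 150W = 366W
eps_budget = board_budget(66, 225, 2)   # same plug at EPS's 225W rating = 516W

r_eff = parallel(5.0, 5.0)              # 5 mOhm stacked on 5 mOhm -> 2.5 mOhm
seen = reported_power(150, 5.0, r_eff)  # 150W real draw reads as 75W
```

This is why a stacked 5 mOhm "halves the reading": the sense voltage across 2.5 mOhm at a given current is half what it was across 5 mOhm.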


----------



## StreaMRoLLeR

blackzaru said:


> Your gpu temps: current: 15, min: 13, max: 27... Let's just say that I doubt that the guy above you, running on air at 63-66, has the same temperature stability as you have, watercooled, and either running a chilled bath, or in a room at a temperature of 10 degrees.
> 
> Let's compare apples to apples. You can't expect almost all 3080s to be stable at 2115MHz 0.925V, not with normal cooling.


Yes. He has extreme H2O with extremely low ambient or ice chilling. Also, I haven't seen a single 3080 that can do this with air cooling (exceptions can be seen in this thread).

I didn't touch fan RPM or any PT. It's just -280 on the core, then bumped to 2115 for gaming scenarios.

Compare air to air or water to water.


----------



## StreaMRoLLeR

SoldierRBT said:


> Here's my undervolt result. Score: 12916. 0.950v locked at 2160MHz. Average clocks 2145-2130MHz. Max temp: 53C Max Board Power Draw: 367W
> 
> 
> https://www.3dmark.com/pr/606711
> 
> 
> View attachment 2468193


Very good, friend. But I see you pumped up the PT and fans.

I tried mine in a real-world gaming scenario without any MEM OC, too. I think my MEM isn't as good as yours.

The system locks up and black-screens at +1000 MEM.


----------



## WallissonViana

[QUOTE = "mattxx88, postagem: 28686997, membro: 372213"]
isso é minha pontuação de baixo volt 0,925v 2115mhz



Acho que quase todos os 3080 podem executar esta freq / volt

editar

finalmente consegui meus 13k, ainda contando


[/ CITAR]
Por eu ser membro novo não consegui lhe enviar mais mensagem, você poderia nos mandar o gráfico dessa pontuação ?


----------



## mattxx88

WallissonViana said:


> Por eu ser membro novo não consegui lhe enviar mais mensagem, você poderia nos mandar o gráfico dessa pontuação ?


If you don't manage well with English, this can help you: DeepL Translate

Anyway, I don't get which graph you need.


----------



## SoldierRBT

Streamroller said:


> Very good friend. But i see you pumped up PT and fan.
> 
> I tried mine with real world gaming scenario without any MEM oc too. I think my MEM isnt good as yours/
> 
> Locks out system and black screen at 1000 MEM


PT shouldn't have had an impact on my run, since it was only drawing 367W max. PT only allows the card to pull more watts (450W). My memory is good up to +1200; above that, performance decreases. Have you tried cooling the backplate? Lowering mem temps may help to stabilize +1000 on your card.

Has anyone gotten a better memory OC with a water block vs air cooling? In Unigine Heaven, going from +0 to +1200 I get 5fps more (99 to 104fps), but +1300 dropped it to 100fps, and +1400 is worse, dropping to 90fps. I wonder if these errors are just hot spots in the memory chips.


----------



## StreaMRoLLeR

Slowly ramping up. If I can keep it under 49C it can HOLD 2220. 12,981 with +925 MEM. If I could do +1100 MEM, it'd easily be 13,100.


----------



## blackzaru

SoldierRBT said:


> PT shouldn’t have an impact in my run since it was only drawing 367W max. PT only allows the card to pulls more watts (450W). My memory is good up to +1200 above that performance decreases. Have you tried cooling the backplate? Lowering mem temps may help to stabilize +1000 in your card.
> 
> Has anyone got better memory OC with a water block vs air cooling? In heaven Unigine going from +0 to +1200 I get 5fps more (99 to 104fps) but +1300 dropped it to 100fps and +1400 is worst dropping to 90fps. I wonder if these errors are just hot spots in the memory chips.


The reason your score is lowering is that your VRAM is unstable. GDDR6X does error detection and retry (often loosely called ECC); it will effectively slow itself down slightly if it starts spitting out errors (up to a certain extent).
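One way to see this behavior is to sweep memory offsets and watch for the score peak: past the point where error retry kicks in, the score falls even though the clock keeps rising. A minimal sketch; the offset-to-fps pairs mirror the Heaven numbers quoted earlier in the thread, and `find_knee` is a hypothetical helper, not a real tool.

```python
# Find the memory offset where the benchmark score peaks; beyond it,
# GDDR6X error detection/retry costs more performance than the extra
# clock gains. Sample data mirrors the quoted Heaven results
# (offset -> avg fps); run your own sweep to get real numbers.

samples = {0: 99, 1200: 104, 1300: 100, 1400: 90}

def find_knee(results):
    """Return the offset with the best score; anything past it is the
    error-retry region where raising the clock lowers performance."""
    return max(results, key=results.get)

best = find_knee(samples)  # 1200 for the data above
```

In practice this is why benchers tune memory by score, not by "does it crash": the retry mechanism hides instability long before a hard crash appears.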


----------



## lordzed83

Garrett1974NL said:


> View attachment 2468148
> 
> 
> Not sure why my Inno3D X2 3080 has an "extra" connector... which isn't even used lol... only the 2 PWM fan connectors are used.
> It's not for any kind of RGB lighting because the card simply doesn't have that, only the INNO3D GEFORCE RTX lights up which gets its power from 1 of the PWM headers.
> (along with 1 of the 2 fans)
> Anyone got an idea?
> I could just remove the connector and bend the pins till they break off, if I don't then no waterblock would fit.
> I don't care about warranty so if it breaks screw it lol


That's for a top fan, I bet.


----------



## StreaMRoLLeR

We just need the voltage controls from Classified.exe.

I hope at some point EVGA decides to open up the program.

EDIT: turns out it's impossible due to a different VRM controller (per Jacob).


----------



## lordzed83

Streamroller said:


> My golden bin. just passed .975mV 2115-2100 Port Royal.
> This values are at 366W draw at 2100mhz. Stay between 63-66
> 
> Giving a "normal" 3080 could have trouble reaching 2100mhz even with 400W+ bios . I am happy
> View attachment 2468168


Think I got one of the GOLDEN ones myself. I need the shunt mod because 2.2 is stable; it just needs more juice.


----------



## lordzed83

blackzaru said:


> The reason your score is lowering is because your ram is unstable. GDDR6X is ECC RAM, it will downclock itself slightly if it starts spitting out errors. (Up to a certain extent.)


That's why I did the mod above; +1200 on the mems is no problem now.


----------



## SoldierRBT

blackzaru said:


> The reason your score is lowering is because your ram is unstable. GDDR6X is ECC RAM, it will downclock itself slightly if it starts spitting out errors. (Up to a certain extent.)


Yeah, but do lower memory temps help reduce errors and improve the score? Or is a voltage mod the only way to get a higher memory OC? I've tried low ambient temps and a 120mm fan on the backplate, and +1200 is still the best it can do.


----------



## lordzed83

@SoldierRBT Does reducing GPU temperature help it stay stable and error-free?


----------



## blackzaru

SoldierRBT said:


> Yeah but does lower memory temps help to reduce errors and improve score? Or just voltage mod is the only way to get higher memory OC? I've tried low ambient temp and 120mm fan on the backplate and +1200 still the best It can do.


VRAM stability is the same as regular RAM stability; it's a combination of temps, voltage, frequency, and the silicon lottery. The lower the temps, the better (in general); voltage is entirely dependent on the chip and silicon you have; and frequency is entirely up to what you set. You have no control over the silicon lottery, though, as your VRAM's stability is dictated by your "weakest" chip.

Now, about voltage: I have not looked into the specifics of Micron's GDDR6X (yet), but every RAM IC operates the same way. It's a "triangle" of stability between temperature, voltage, and frequency. Assuming your temperature and frequency stay the same, there is a range of voltage in which the memory will work correctly: if you are too low, the memory lacks sufficient voltage to maintain stability; if you are too high, lots of factors start to affect stability, such as electromigration (especially in the long term). Temperature acts the same: at a given voltage and frequency you need to be in a certain temperature range, too hot and it's unstable, too cold and you hit cold bugs. Finally, frequency: this one is easy, as it works like a CPU, so most people here understand how this part works.
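The "triangle" above can be caricatured as a predicate: a (voltage, frequency, temperature) point is stable only inside a chip-dependent window. Purely illustrative; the numeric bounds below are invented for the sketch, not measured from any card.

```python
# Toy model of the voltage/frequency/temperature stability triangle.
# All window bounds are made up for illustration; every chip differs.

def stable(voltage, freq_mhz, temp_c,
           v_min=0.85, v_max=1.10, f_max_at_vmax=2200.0,
           t_min=5, t_max=90):
    """A point is stable if voltage and temperature sit inside their
    windows AND the frequency is within what the current voltage can
    support (modeled here as linear in voltage)."""
    if not (v_min <= voltage <= v_max):
        return False          # too low: starved; too high: degradation risks
    if not (t_min <= temp_c <= t_max):
        return False          # too hot: unstable; too cold: cold bugs
    f_max = f_max_at_vmax * (voltage / v_max)  # toy voltage/frequency trade-off
    return freq_mhz <= f_max
```

The point of the sketch is only that the three axes trade off against each other: dropping temperature or raising voltage widens the frequency window, within limits.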


----------



## lordzed83

Run from my FE with the block. Not yet checked how low I can keep it stable with the new temps; it needs liquid metal and the shunt mod anyway.


----------



## Krisztias

I have installed my Alphacool block on the FE too; the GPU core runs at water temp +8°C with power maxed out (idle 28°C, under load 40°C). This is with Thermal Grizzly Hydronaut and a few more thermal pads on the back.
Tomorrow I'll see what my card can do. Fingers crossed.


----------



## StreaMRoLLeR

blackzaru said:


> Vram stability is the same as regular RAM stability, it's a combination of: temps, voltage, frequency, and silicone lottery. The lower the temps, the better (in general), voltage, is entirely dependent on the chip and silicone you have, and frequency is entirely up to what you set. You have no control over silicone lottery though, as your Vram's stability is dictated by your "weakest" chip.
> 
> Now, about voltage, I have not looked into the specifics of Micron's GDDR6X (yet), but, every ram IC operates the same way: it's a "triangle" of stability between, temperature, voltage, and frequency. Assuming Your temperature and frequency stays the same, there is a range of voltage in which the memory will work correctly, if you are too low, the memory will lack the sufficient voltage to maintain stability, if you are too high, lots of factors will start to affect your stability, such as electron migration (especially in the long term). Temperature acts the same, at a certain voltage and frequency, you need to be at a certain temperature range, too hot: unstable, too cold: cold bugs. Finally: frequency, well, this one is easy, as it works like a CPU, so most people here understand how this part works.


Great explanation, but the FTW3 isn't meant for the extreme overclocking (at least on AIR) that SoldierRBT and I chase, like going over +1200 MEM. We need Classified.exe for advanced voltage control.

No shunt or BIOS mod is enough once you've reached a certain point; going further is impossible without the Kingpin tool or hard voltage mods like beardedhardware's.


----------



## lordzed83

View attachment 2468281


Run from my FE with the block. Not yet checked how low I can keep it stable with the new temps; it needs liquid metal and the shunt mod anyway.


Krisztias said:


> I have installed my Alphacool Block on the FE too, got water temp +8°C on the GPU Core with power maxed out. (idle 28°C, under load 40°C). This is with Thermal Grizzly Hydronaut and with a little bit more thermal pads on the back.
> Tomorrow will look what my card can do. Fingers crossed


Welcome to the Alphacool club, fantastic block. But they could have given a 2mm pad instead of gluing two pads together on that one spot, lol.


----------



## blackzaru

Streamroller said:


> Great explanation but FTW3 isnt meant for extreme overclocking (at least for AIR ) like me and SoldierBRT chase. Like going over 1200 MEM. We need classifield.exe for advanced voltage controlling.
> 
> No shunt or Bios mod is enough when you reached a certain point and further is impossible without Kingpin tool or hard voltage mods like beardedhardware's


Some cards, such as the ASUS TUF, have through-holes for solutions such as Elmor's EVC. I know very few cards have those (otherwise you need to properly connect wires to the exact pinouts of your controller), but for those who have that option on their card, it could be a way to tinker even more with it.


----------



## ssgwright

I'm still using liquid metal on my shunts... I should probably take it off before my shunts start falling off lol... anyone got a link to some resistors for this? What's the best 5 mOhm?


----------



## Falkentyne

ssgwright said:


> I'm still using liquid metal on my shunts... I should probably take it off before my shunts start falling off lol... anyone got a link to some resistors for this? What's the best 5ohm?


These should cover what range you need.



https://www.mouser.com/ProductDetail/Panasonic/ERJ-M1WSF5M0U?qs=js9DCdkuA2rWKXij%252BGW9Dg%3D%3D




https://www.mouser.com/ProductDetail/Panasonic/ERJ-M1WSF3M0U?qs=js9DCdkuA2rA4d2%252BtTbTxA%3D%3D




https://www.mouser.com/ProductDetail/Panasonic/ERJ-M1WSF10MU?qs=%2Fha2pyFaduhOx12dbJfB0MsN%2FezM8qu6pargcXxcXkqNR6e9JV9E2g%3D%3D




https://www.mouser.com/ProductDetail/Panasonic/ERJ-M1WSF15MU?qs=%2Fha2pyFaduhOx12dbJfB0K6s6o%252BXGIIw%2FDdm80HvdvgsuqTrPJ4DhA%3D%3D
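For picking a value: a stacked resistor changes the card's power reading by the ratio of the new effective resistance to stock, so the real-vs-reported multiplier follows directly. A quick sketch of the multiplier each of the Panasonic values linked above would give; the 5 mOhm stock shunt value is an assumption, so check your own card's shunts before ordering.

```python
# Real power = reported power * multiplier. For a stacked resistor the
# effective shunt is the parallel combination with the stock one.
# Stock value of 5 mOhm is an assumption; verify on your own card.

R_STOCK = 5.0  # mOhm (assumed)

def parallel(r1, r2):
    return (r1 * r2) / (r1 + r2)

def multiplier_replaced(r_new):
    """Multiplier if the stock shunt is fully replaced by r_new."""
    return R_STOCK / r_new

def multiplier_stacked(r_add):
    """Multiplier if r_add is soldered on top of the stock shunt."""
    return R_STOCK / parallel(R_STOCK, r_add)

# The Panasonic values linked above: 5, 3, 10, 15 mOhm
for r in (5.0, 3.0, 10.0, 15.0):
    print(f"{r} mOhm stacked -> x{multiplier_stacked(r):.2f} real vs reported")
```

For example, stacking 5 mOhm on 5 mOhm gives a 2x multiplier (the "halve the reading" case discussed earlier), while 15 mOhm stacked gives a gentler ~1.33x.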


----------



## ssgwright

Falkentyne said:


> These should cover what range you need.
> 
> 
> 
> https://www.mouser.com/ProductDetail/Panasonic/ERJ-M1WSF5M0U?qs=js9DCdkuA2rWKXij%252BGW9Dg%3D%3D
> 
> 
> 
> 
> https://www.mouser.com/ProductDetail/Panasonic/ERJ-M1WSF3M0U?qs=js9DCdkuA2rA4d2%252BtTbTxA%3D%3D
> 
> 
> 
> 
> https://www.mouser.com/ProductDetail/Panasonic/ERJ-M1WSF10MU?qs=%2Fha2pyFaduhOx12dbJfB0MsN%2FezM8qu6pargcXxcXkqNR6e9JV9E2g%3D%3D
> 
> 
> 
> 
> https://www.mouser.com/ProductDetail/Panasonic/ERJ-M1WSF15MU?qs=%2Fha2pyFaduhOx12dbJfB0K6s6o%252BXGIIw%2FDdm80HvdvgsuqTrPJ4DhA%3D%3D


are these the right size they look a lot more "square" than the shunts on the card?


----------



## ssgwright

thanks just ordered... hopefully my shunts can hold on a couple more days!


----------



## phoenixyz

SoldierRBT said:


> Nope, just with the 450W beta BIOS and 100% fans speed.



what are your gains in percentage terms from stock to OC in games?


----------



## 414347

lordzed83 said:


> Think I got one of GOLDEN ones myself. I need Shut mod cause 2,2 is stable just need more juice


Is that on water? Hmm... not to say it's not good, but my 3090 Strix is on air, and running Unigine Heaven 4.0 with all MAX settings (I know it's an old benchmark) I can sustain a solid 2110-2130MHz with fans at 90%, and the temp never breaks 68C at 22C ambient.


----------



## Falkentyne

ssgwright said:


> are these the right size they look a lot more "square" than the shunts on the card?


That image is only for reference.
2512 current-sensing shunts are all the same size; 2512 is the package size (a dimension specification, 0.25" x 0.12"). A square shunt can't be 2512.

If this helps at all, here's the datasheet for one of them.






ERJM1WSF5M0U - Current Sensing Chip Resistors - Panasonic (industrial.panasonic.com)


----------



## ssgwright

Falkentyne said:


> That image is only for reference.
> 2512 current sensing shunts are all the same size. 2512 is the size (it's a dimension specification).
> A square shunt can't be 2512.
> 
> If this helps at all, here's the datasheet for one of them.
> 
> 
> 
> 
> 
> 
> ERJM1WSF5M0U - Current Sensing Chip Resistors - Chip Resistors - Industrial Devices & Solutions
> 
> 
> Product specifications and documents of ERJM1WSF5M0U, Current Sensing Chip Resistors, Panasonic.
> 
> 
> 
> 
> industrial.panasonic.com


you're the man, thank you!


----------



## lordzed83

NewUser16 said:


> Is that on water? Hmm... not to say its not good, but my 3090 Strix is on air and running Unigine Heaven 4.0 all MAX settings (I know its old benchmark) but I can sustain solid 2110MHz- 2130MHz with fans at 90% and temp never brakes 68C in ambient 22C


Yeah, but I paid 650 quid for this card and 120 for the block; how much did you pay for the Strix?? Just wait till I shunt mod this and put liquid metal on the die. My water is 23-24°C when benching all day; I just need liquid metal to reduce temps on the die. Quite possibly I'll upgrade the thermal pads when I strip the card, and change my 10-year-old water pump to the new D5 I got with the block.

That's what I've got atm.
People, I can't buy any more GPUs.
ZEED

So far: a 3080 FE for me, a 3080 Gigabyte Eagle for 650 quid for my mate from Manchester, a 3080 Palit for 720 quid to a mate in France today, and a 3070 for a mate in Poland who wanted NAVI for Xmas, and AMD xxxx him and EVERYONE over.


----------



## Warrimonk

You guys are absolutely thrashing my 3080 XC3 Ultra in Port Royal. Out of the box I was getting 10,800. As I push it more and more, I'm getting around 11,400 (no memory tweaking yet).
As soon as I set the GPU above +200 it attempts to boost past 2200 and crashes, but at +200 it barely scrapes past 2000MHz. Temps max out at about 70 by the end of the test with a custom fan curve.

Time for an undervolt and a custom curve, I guess.


----------



## omarrana

Hello everyone,
I have an Inno3D RTX 3080 iChill X3, which is a reference board. It's a great graphics card that boosts very well to 2100MHz. Can I flash the Gigabyte BIOS to get 370W? Has anyone tried it? Thank you.


----------



## lordzed83

Let me run Port Royal for you.


----------



## Warrimonk

lordzed83 said:


> Let me run port for You


Lucky man, I wish I had gotten the FE. What is the power limit on that thing, 370W? The 3080 FE isn't even possible to purchase in Canada, because NVIDIA stopped selling it and Best Buy never took over selling them.

My 3080 XC3 Ultra feels so power-gimped. I am running it at 2005MHz @ 0.9V and I'm STILL hitting the power limit.


----------



## lordzed83

Warrimonk said:


> Lucky man, wish i had gotten the FE. What is the power limit on that thing? 370W? The 3080FE isn't even possible to purchase in Canada because Nvidia stopped selling it, and BestBuy never took over selling them.
> 
> My 3080XC3 Ultra feels so power gimped. I am running it at 2005mhz @ .9V and I'm STILL hitting a power limit.


Yeah, 370W.









Almost managed to grab NAVI for my mates... but AMD is a goddamn joke and the site kept crashing at the payment step.
Proof I had 'em in the basket; I even had a 6900 XT in the basket then.


https://bpccdn.fra1.digitaloceanspaces.com/original/3X/2/1/21258e594b98d6afedd3816e2895dfc471724792.png




https://bpccdn.fra1.digitaloceanspaces.com/original/3X/9/0/900cbed92e87637c58f0bb80d92d839840191b1f.png




https://bpccdn.fra1.digitaloceanspaces.com/original/3X/8/5/85d53e3173c17d3e4d9bf5168aaf6c88a9484917.png


Banned for TRYING TO PAY lol


----------



## Awsan

A bunch of 3080s are available on Best Buy.

Example:


https://www.bestbuy.com/site/gigabyte-geforce-rtx-3080-10g-gddr6x-pci-express-4-0-graphics-card-black/6430620.p?skuId=6430620





https://www.bestbuy.com/site/gigabyte-geforce-rtx-3080-10g-gddr6x-pci-express-4-0-graphics-card-white/6436219.p?skuId=6436219



Run bois run


----------



## Krisztias

lordzed83 said:


> View attachment 2468281
> 
> 
> Run from my FE with the block; haven't checked how low I can keep it stable with the new temps. It needs liquid metal and a shunt mod anyway.
> 
> Welcome to the Alphacool club, fantastic block. But they could have given a 2mm pad instead of gluing 2 pads together on that one spot lol.


Yes, great block indeed. For the price, I can live with the pads in that tiny spot, but it's true. The EK block looks great, but I can't justify twice the price (135 vs. 270 EUR). It may be better, but certainly not 2x better.


----------



## lordzed83

Krisztias said:


> Yes, great block indeed. I think for the price, I can live with the pads in that tiny spot, but true. The EK block looks great, but I can't justify the 2x price (135 vs. 270 eur). It can be better, but not 2x better for sure.


Ye, not much better. Just ran 40 minutes of MSI Kombustor with +220 on core, +1200 on memory, no problem; temps 40-41°C.


----------



## rioja

Why is there still no updated info about the 3080 MSI Suprim X on the 1st page? It has a 370/430W power limit and 16+4 power stages for vGPU+vMem.


----------



## xermalk

mattxx88 said:


> i think almost all 3080 can run this freq/volt


Not a chance. My TUF non-OC absolutely refuses to go over 1995 stable, no matter what I do to the voltage curve. The second it's above 1995 under load it instantly crashes.
The highest score I've managed to get is 11777, and that's not gaming stable.


----------



## SoldierRBT

Thanks to @Streamroller. Got a new score on air with low ambient temp. 

Graphics Score: 13 401 


https://www.3dmark.com/pr/617704



Locked the voltage to 1.075V and added +255 on the core, +1200 on memory; max temp was 55°C. I'm still hitting the PL on some parts of the run (the core dropped a few times for half a second to 2175 MHz) but the average was around 2235-2250 MHz. Even with low ambient (12°C) and a 120mm fan on the backplate, memory still decreases performance after +1200. I guess my memory is hitting a voltage wall rather than temperature. Purchased an EK FTW3 block but the shipping date says January 6th, 2021. I wonder if it'd be able to sustain 2280 MHz on water.


----------



## rioja

Which card is better - Aorus Extreme or Suprim X? Both will be watercooled later so noise and temperature on air is no matter


----------



## ssgwright

SoldierRBT said:


> Thanks to @Streamroller. Got a new score on air with low ambient temp.
> 
> Graphics Score: 13 401
> 
> 
> https://www.3dmark.com/pr/617704
> 
> 
> 
> Locked the voltage to 1.075v and added +255 on the core +1200 on memory, max temp was 55C. I'm still hitting PL on some parts of the run (core dropped a few times for half a second to 2175MHz) but average was around 2250-2235MHz. Even with low ambient (12C) and a 120mm fan on the backplate, memory still decreases performance after +1200. I guess my memory is hitting a voltage wall and not temperature. Purchased a EK FTW3 Block but shipping date says January 6th 2021. I wonder if It'd be able to sustain 2280MHz on water.


dude... you need another hobby lol


----------



## StreaMRoLLeR

SoldierRBT said:


> Thanks to @Streamroller. Got a new score on air with low ambient temp.
> 
> Graphics Score: 13 401
> 
> 
> https://www.3dmark.com/pr/617704
> 
> 
> 
> Locked the voltage to 1.075v and added +255 on the core +1200 on memory, max temp was 55C. I'm still hitting PL on some parts of the run (core dropped a few times for half a second to 2175MHz) but average was around 2250-2235MHz. Even with low ambient (12C) and a 120mm fan on the backplate, memory still decreases performance after +1200. I guess my memory is hitting a voltage wall and not temperature. Purchased a EK FTW3 Block but shipping date says January 6th 2021. I wonder if It'd be able to sustain 2280MHz on water.


So proud of you. This is the world record on air for the Ampere series. With water, where temp matches ambient, I think hold temps will be around 20°C.


----------



## th3illusiveman

So are the people with crappy 3080 bins not posting, or did I really get the worst chip in the factory? With my FTW3 Ultra (450W BIOS) I can't even dream of running 2100 MHz, and I can't get 2000 MHz game stable. Ironic that the first time I get a flagship GPU (due to the market) I get the worst overclocking chip I've ever had. The highest 3DMark score I could get was an 18500 graphics score in Time Spy, and the card can't give any more. Currently running undervolted at 0.881V (1860 MHz)... This stupid card is like having a Mustang with a 2-cylinder engine lol. At least it's quiet.


----------



## StreaMRoLLeR

th3illusiveman said:


> So are the people with crappy 3080 bins not posting or did i really get the worst chip in the factory? With my FTW3 Ultra (450W bios) I can't even dream of running 2100Mhz and i can't get 2000Mhz game stable. ironic how the first time i get a flagship GPU (due to market) i get the worst overclocking chip i've ever had. Highest 3DMark score i could get was 18500 graphics score on time spy and the card can't give anymore. Currently running undervolted at 0.881 (1860Mhz)... This stupid card is like having a mustang with a 2 cylinder engine lol. at least its quiet.


Pretty much, yes, but if it eases your pain: I had 2 more FTW3s too and both of them peaked at 2040 MHz.

I would say the ratio of golden bins is pretty low on the FTW3.

On the Strix it's much higher. The worst Strix I've ever seen can hold 2085 at default (in games). I do TeamViewer OC and UV for many people, 15-16 cards total, and I can say your card is not the worst.


----------



## th3illusiveman

Yea, seems EVGA is throwing any and every chip they find (even the ones they drop lol) onto their FTW3s and shipping them out for those margins! Sad but understandable from a $$$ POV. What a waste of a PCB though...


----------



## acoustic

My FTW3 Ultra is not the greatest bin either. Haven't bothered with undervolting or using curve OC, but I have the 450watt BIOS on it and it's... meh. I run +30 core which gives me around 2025-2040 once the card heats up. I have the worst memory chip(s) on the planet as anything over +500 causes corrections.

Coming from my 2080TI FTW3 Ultra, it's definitely disappointing, but that's silicon lottery. What grinds my gears is that I could have grabbed a STRIX instead (either the last FTW3 Ultra, or the last STRIX at Microcenter), but I went with the FTW3 Ultra because I prefer EVGA, and the FTW3 Ultra for Turing was superior to the STRIX. I figured it would be the same thing this generation, but unfortunately not. I think ASUS has the top card this generation for sure.


----------



## Krisztias

lordzed83 said:


> Ye, not much better. Just ran 40 minutes of MSI Kombustor with +220 on core, +1200 on memory, no problem; temps 40-41°C.


Thank God I didn't buy the Corsair block (I think it looks better than the Alphacool).






The backplate is only AESTHETIC. Ours is a really sturdy piece of metal with a lot of thermal pads for the backside of the card.


----------



## ZealotKi11er

th3illusiveman said:


> So are the people with crappy 3080 bins not posting or did i really get the worst chip in the factory? With my FTW3 Ultra (450W bios) I can't even dream of running 2100Mhz and i can't get 2000Mhz game stable. ironic how the first time i get a flagship GPU (due to market) i get the worst overclocking chip i've ever had. Highest 3DMark score i could get was 18500 graphics score on time spy and the card can't give anymore. Currently running undervolted at 0.881 (1860Mhz)... This stupid card is like having a mustang with a 2 cylinder engine lol. at least its quiet.


Usually, that's how it happens. Also, we might get worse chips now that the reviews are out. That's what happens if NVIDIA figures out ways to increase yields.


----------



## acoustic

I'm pretty sure this one is pretty mediocre/poor. 3080 FTW3 Ultra (450W BIOS), locked @ 1.018V, gets me 2040-2055 stable. Once temps go up, the clock speed drops of course. Passed about 40 Time Spy Extreme loops and some passes of Port Royal. Temps dropped a bit. I'm not super concerned about temps or power usage (couldn't care less about power usage besides hitting the limiter)... but yeah, definitely not a great chip. My absolute best Port Royal score was 12639; I pulled a 12366 locked @ 1.018V.


----------



## blackzaru

th3illusiveman said:


> So are the people with crappy 3080 bins not posting or did i really get the worst chip in the factory? With my FTW3 Ultra (450W bios) I can't even dream of running 2100Mhz and i can't get 2000Mhz game stable. ironic how the first time i get a flagship GPU (due to market) i get the worst overclocking chip i've ever had. Highest 3DMark score i could get was 18500 graphics score on time spy and the card can't give anymore. Currently running undervolted at 0.881 (1860Mhz)... This stupid card is like having a mustang with a 2 cylinder engine lol. at least its quiet.


Those people have very good chips, though yours is not great. If 1860 MHz stable is the best you can get out of a 450W BIOS, you really got the bad side of the silicon lottery. To give you an idea, my own chip is "average", and with everything stock it pushes and holds 1940 MHz on a 375W BIOS (although it stops around 350W because it's not fully drawing the 75W from the PCIe slot).


----------



## rioja

acoustic said:


> Coming from my 2080TI FTW3 Ultra, it's definitely disappointing, but that's silicon lottery. What grinds my gears is that I could have grabbed a STRIX instead (either the last FTW3 Ultra, or the last STRIX at Microcenter), but I went with the FTW3 Ultra because I prefer EVGA, and the FTW3 Ultra for Turing was superior to the STRIX. I figured it would be the same thing this generation, but unfortunately not. I think ASUS has the top card this generation for sure.


But can you still return it within the 2-week period and try a Strix?
I'd like to see whether it still has the coil whine issue in the newest batches.

From what I've learned it is:

Top tier - Strix and FTW3 (the Strix can suffer from coil whine and the FTW3 from poor OC)
2nd tier - Aorus Extreme/Suprim X/Palit GameRock/TUF
then all the rest


----------



## acoustic

I got my card at the beginning of October. I'm well past 2 weeks lol


----------



## rioja

I feel I must cancel my Aorus Extreme order and aim for the Strix only, despite the risk of coil whine.
Is there a real difference between the Strix and Strix OC versions, or is it just BIOS firmware?
Strix OC version - 1935 MHz (boost clock)
Strix normal version - 1740 MHz (boost clock)


----------



## acoustic

No difference as far as I've seen/read. It's all silicon lottery at that point anyway.


----------



## StreaMRoLLeR

rioja said:


> I feel I must cancel my Aorus Extreme order and aim to Strix only despite the risk of the coil whine.
> Is there real difference between Strix and Strix OC versions or it’s just bios firmware?
> Strix OC version - 1935 MHz (Boost Clock)
> Strix normal version - 1740 MHz (Boost Clock)


The Strix OC tends to get the better chip, as it has for years.


----------



## th3illusiveman

blackzaru said:


> Those people have very good chips. Although, yours is not great. If 1860MHz stable is the best you can get out of a 450W bios, you really got the bad side of silicone lottery. To give you an idea, my own chip is "average", and everything stock, it pushes and holds at 1940MHz on a 375W (although, it stops around 350W because it's not fully drawing the 75W from the pcie connection) bios.


For clarity, I should have noted I'm running undervolted for the reduced temps/noise. I can get it to run Control with RTX on and DLSS off for hours at 1950 MHz, but it's like 90+ watts for a few % FPS... not worth it, and I think I've got about +25 MHz more above that before it's toast lol. Some lucky guys can run 1950 MHz at 0.875V, which is what I was hoping for, but it is what it is.


----------



## omarrana

Hello guys,
I have two cards with me:
Inno3D RTX 3080 iChill X3
EVGA RTX 3080 XC3 Ultra

Inno3D RTX 3080 iChill X3:

This card is amazingly quiet, has great temperatures, and never goes above 70°C.
The card can overclock +120 MHz core / +800 memory stable.
I get an OK Time Spy score.
In benchmarks and games I have seen it hit 2130 maximum, hovering around 2000-2040.

EVGA RTX 3080 XC3 Ultra:
This card can overclock up to +190 MHz core / +900 memory at full fan speed,
stable at +150 MHz / +899 memory on air at normal fan speed.
However, in benchmarks and games I can see the clock speed go up to 2190 MHz and then come down.
Sometimes it touches 2215.
But the strange thing is this card has a lower Time Spy score than the Inno3D and higher temperatures.

I plan to watercool the card after a year or so. Which one do you guys think I should keep?
I have 10 days to return one of them.


----------



## blackzaru

th3illusiveman said:


> For clarity, I should have noted I'm running undervolted for the reduced temps/noise. I can get it to run Control with RTX on and DLSS off for hours at 1950 MHz, but it's like 90+ watts for a few % FPS... not worth it, and I think I've got about +25 MHz more above that before it's toast lol. Some lucky guys can run 1950 MHz at 0.875V, which is what I was hoping for, but it is what it is.


It's quite normal; by undervolting to 0.925V and overclocking a little, I can get mine past 2000 MHz stable. The thing is, given you have a high-wattage BIOS on a triple-8-pin card (FTW3), you got a bit shafted by the silicon lottery. Most cards boost in the 1900s completely stock and achieve 2000+ MHz on undervolts. It's not dramatic, but, for example, my card (a 375W TUF) will certainly get better results once shunt-modded to the point where power consumption matches what your BIOS can allow. Some people have incredible chips; unfortunately, I have an average one, and you got shafted by EVGA not binning chips and just trying to push their more expensive card at all costs.


----------



## blackzaru

omarrana said:


> Hello guys,
> I have two cards with me:
> Inno3D RTX 3080 iChill X3
> EVGA RTX 3080 XC3 Ultra
> 
> Inno3D RTX 3080 iChill X3:
> 
> This card is amazingly quiet, has great temperatures, and never goes above 70°C.
> The card can overclock +120 MHz core / +800 memory stable.
> I get an OK Time Spy score.
> In benchmarks and games I have seen it hit 2130 maximum, hovering around 2000-2040.
> 
> EVGA RTX 3080 XC3 Ultra:
> This card can overclock up to +190 MHz core / +900 memory at full fan speed,
> stable at +150 MHz / +899 memory on air at normal fan speed.
> However, in benchmarks and games I can see the clock speed go up to 2190 MHz and then come down.
> Sometimes it touches 2215.
> But the strange thing is this card has a lower Time Spy score than the Inno3D and higher temperatures.
> 
> I plan to watercool the card after a year or so. Which one do you guys think I should keep?
> I have 10 days to return one of them.


For the two cards: what are the average boost clocks in the benchmarks? If the second card has a higher average boost clock in the benchmark but a lower score (by a noticeable margin), then your memory OC might be unstable on the second card. But first things first: check the average clock of both cards during the benchmark.


----------



## criminal

th3illusiveman said:


> So are the people with crappy 3080 bins not posting or did i really get the worst chip in the factory? With my FTW3 Ultra (450W bios) I can't even dream of running 2100Mhz and i can't get 2000Mhz game stable. ironic how the first time i get a flagship GPU (due to market) i get the worst overclocking chip i've ever had. Highest 3DMark score i could get was 18500 graphics score on time spy and the card can't give anymore. Currently running undervolted at 0.881 (1860Mhz)... This stupid card is like having a mustang with a 2 cylinder engine lol. at least its quiet.


My game stable clock undervolted to .976(I think) is 2010. So yeah sounds like you got a bad case of the silicon lottery.


----------



## omarrana

blackzaru said:


> For the two cards: what are the average boost clocks in the benchmarks? If the second card has a higher average boost clock in the benchmark but a lower score (by a noticeable margin), then your memory OC might be unstable on the second card. But first things first: check the average clock of both cards during the benchmark.


Inno3D:
Max clock frequency: 2,100 MHz (base 1,440 MHz)
Average clock frequency: 1,950 MHz

EVGA:
Max clock frequency: 2,145 MHz (base 1,440 MHz)
Average clock frequency: 1,880 MHz

How can I improve the EVGA's average clock? It seems like this card has more potential than the Inno3D but cannot maintain its clocks.

If I run the fans at full speed I get 1950 MHz. Does that mean the EVGA will outperform once I install a waterblock on it?


----------



## blackzaru

omarrana said:


> Inno3D:
> Max clock frequency: 2,100 MHz (base 1,440 MHz)
> Average clock frequency: 1,950 MHz
> 
> EVGA:
> Max clock frequency: 2,145 MHz (base 1,440 MHz)
> Average clock frequency: 1,880 MHz
> 
> How can I improve the EVGA's average clock? It seems like this card has more potential than the Inno3D but cannot maintain its clocks.
> 
> If I run the fans at full speed I get 1950 MHz. Does that mean the EVGA will outperform once I install a waterblock on it?


Your frequency curve might be more aggressive on the EVGA than on the Inno3D, hence the higher peak clock but lower overall clock. To get an idea of which one has more "potential", figure out which one has the highest stable clock at a given voltage (e.g. 0.950V); that will give you a good idea of which has the most spare room to clock higher at lower temperatures.


----------



## ssgwright

will these work for the shunt mod?


----------



## Falkentyne

ssgwright said:


> will these work for the shunt mod?
> 
> View attachment 2468841


As long as those are 10 milliohm and not 10 megaohm, yes, they will work.
Usually milliohm is written as mOhm, and megaohm is written as MOhm.
But on the shunts themselves it can be written in different ways. Where did you get them? Original link, please?


----------



## ssgwright

I see people using 5 milliohm; what's the difference between 10 and 5?


----------



## Falkentyne

ssgwright said:


> i see people using 5 milliohm what's the difference between 10 and 5?


The lower the resistance, the lower the reported power draw will be.
Stacking a 5 mOhm shunt on a 5 mOhm original shunt halves its power reporting, or in other words doubles your TDP allowance (multiply your HWiNFO power draw values in "Custom values" by 2, or MSI Afterburner power draw values with "x * 2" in the Afterburner custom values field). So if GPU-Z reported that you were pulling 400W of board power, the real board power would be 800 watts.

Stacking a 3 mOhm shunt on top of a 5 mOhm original shunt increases your TDP allowance by 2.67x (a 62.5% reduction in reported power).

A 10 mOhm shunt on top of a 5 mOhm shunt is a more modest 1.5x power allowance (x * 1.5 for Afterburner), and 15 mOhm on top of a 5 mOhm is 1.33x.

Desoldering and replacing the shunt with a LOWER RESISTANCE shunt (for example, desoldering a 5 mOhm and replacing it with a 3 mOhm) is almost exactly the same as stacking an 8 mOhm on top of a 5 mOhm.

Desoldering is always the most accurate method because you are assured of good contact, but it is also the hardest and most dangerous (people sometimes have problems melting the original solder to remove the shunt, and you can damage the board with too much heat).

Stacking shunts by soldering requires soldering experience, but as long as you use flux (you MUST use flux) and tin it properly, it's not hard to do, provided you practice on a spare board first. Or you can glue some spare shunts to a test bench and practice your fluxing and soldering skills there.

Using MG842AR silver paint as its own shunt resistor (bridging the original shunts by painting over them fully) is the safest method (no heat involved; you can use Super 33+ tape around the shunts to protect the board from accidentally getting paint on it), but also the least reliable, as you can have contact issues if the original shunts are NOT fully flat and level (MSI and Founders Edition shunts are known to not be flush).

Excellent video on soldering with flux.
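The multipliers above are just parallel-resistance arithmetic; here is a small sketch (plain Python, values taken from the post) for sanity-checking a planned stack before touching a soldering iron:

```python
def parallel(r1: float, r2: float) -> float:
    """Combined resistance of two shunts in parallel (same units in/out)."""
    return (r1 * r2) / (r1 + r2)

def power_multiplier(original_mohm: float, stacked_mohm: float) -> float:
    """Real power / reported power after stacking a shunt on the original.
    The controller still assumes the original resistance, so the ratio
    is R_original / R_effective."""
    return original_mohm / parallel(original_mohm, stacked_mohm)

# Values from the post (5 mOhm original shunts):
print(power_multiplier(5, 5))    # 2.0   (reported power halved)
print(power_multiplier(5, 3))    # ~2.67
print(power_multiplier(5, 10))   # 1.5
print(power_multiplier(5, 15))   # ~1.33

# Replacing a 5 mOhm outright with a 3 mOhm gives 5/3 ~ 1.67,
# close to stacking an 8 mOhm on top of a 5 mOhm:
print(power_multiplier(5, 8))    # 1.625
```

Multiply whatever GPU-Z or HWiNFO reports by the multiplier to recover the real board power.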


----------



## ssgwright

so do you think I'd get better results with 5 mOhm vs 10 mOhm?


----------



## Falkentyne

ssgwright said:


> so do you think I'd get better results with 5 mOhm vs 10 mOhm?


You only stack with 5 mOhm shunts if you are on water cooling. No air cooled card is going to be able to handle 700W of power. You also have to make sure there are no 10 amp fuses on your card.


----------



## ssgwright

I am on water. Also, what do you think of the wire soldered onto the PCIe shunt in the video you posted? It said it fixes the power throttling.


----------



## martin28bln

Hey guys. Which BIOS has the max power target for a 3080 reference PCB (2x8-pin) with 3x DP + 1x HDMI working? Somebody told me there should be a 390W BIOS available; is that true? Thanks.


----------



## StreaMRoLLeR

martin28bln said:


> Hey guys. Which Bios has the max Powertarget for 3080 ref PCB 2x8pin and 3DP+1HDMI working? Somebody told me there should be a 390W Bios available - is that true? THX


Can't go wrong with the TUF 370W BIOS.


----------



## martin28bln

With the TUF BIOS, I seem to recall that not all outputs work?


----------



## martin28bln

Flashed the AMP Holo BIOS to my Gainward Phoenix but still no increase in PL: 335W peak, and the card doesn't react to changes in Afterburner (power slider). Any ideas?


----------



## akkuman

AORUS GeForce RTX™ 3080 XTREME WATERFORCE 10G Key Features | Graphics Card - GIGABYTE Global. Is this any good?


----------



## StreaMRoLLeR

martin28bln said:


> Flashed the AMP Holo BIOS to my Gainward Phoenix but still no increase in PL: 335W peak, and the card doesn't react to changes in Afterburner (power slider). Any ideas?


The AMP Holo BIOS is garbage, like the card itself. The only working and suitable BIOS is the TUF 370W.


----------



## martin28bln

But are all ports working when flashing the TUF BIOS to a reference PCB? Any experience from others? I have 4 monitors.


----------



## phoenixyz

Good day. I just got an RTX 3080 MSI Gaming X Trio. I am looking at the Strix 450-watt BIOS; I am interested in unlocking the card with a 450-watt BIOS for better OC headroom. Are there any downsides to doing that, like losing a functional DisplayPort?


----------



## MikeGR7

phoenixyz said:


> Good day. I just got an RTX 3080 MSI Gaming X Trio. I am looking at the Strix 450-watt BIOS; I am interested in unlocking the card with a 450-watt BIOS for better OC headroom. Are there any downsides to doing that, like losing a functional DisplayPort?


No idea about the ports because I only use HDMI.
The only minor downside is that the cooler maxes out at 3000 RPM instead of the 3200 RPM the fans are capable of.
Still worth it though; the card is way, way, WAY more stable, even using the air cooler (I use water now but had it on air since launch).


----------



## StreaMRoLLeR

OK, I spent a lot of time with this specially core-clock-binned 3080 FTW3. Mission complete.

Ambient 6°C, Delta 5000 RPM intake at max, default cooler, temp 47°C. It can do 2325 as well but needs tweaking, and I got cold lol xD

Due to my memory being garbage (crashes at +926), I could only push this far.


----------



## PhoenixMDA

@SoldierRBT 
You need watercooling; you have a better card than I do.
Here's my TUF OC with the EK cooler. I sold the Alphacool because it was very restrictive on flow.
https://www.3dmark.com/spy/16203941


----------



## SoldierRBT

PhoenixMDA said:


> @SoldierRBT
> You need watercooling; you have a better card than I do.
> Here's my TUF OC with the EK cooler. I sold the Alphacool because it was very restrictive on flow.
> https://www.3dmark.com/spy/16203941
> View attachment 2468936


Strong CPU and GPU results! Hopefully I'll get the EK FTW3 block next month. Did you shunt-mod the card? I gave up on Time Spy because the GPU hits the power limit at just 1.012V and can't clock as high as in Port Royal.


----------



## PhoenixMDA

In cooling performance the Alphacool and EK are nearly the same, but with the EK you can get more water flow.
The EK cooler is better; on the Alphacool the pads for the memory were 0.5mm too thin.

And yes, as I said: 6x 20 mOhm soldered on in parallel, which is 25% more power. With enough power I think you can do 2235 MHz or more with your card.
I don't push more power through my card; it's only for 24/7 gaming, and for that it's enough.
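The 25% figure follows from the same parallel-resistance math if the stock shunts are 5 mOhm (an assumption; the post doesn't state the stock value). A quick sketch in plain Python:

```python
def stacked_multiplier(original_mohm: float, added_mohm: float) -> float:
    """Real power / reported power after soldering added_mohm on top of
    each original shunt (parallel combination of the two resistances)."""
    r_eff = (original_mohm * added_mohm) / (original_mohm + added_mohm)
    return original_mohm / r_eff

# Assuming 5 mOhm stock shunts (hypothetical, not stated in the post):
m = stacked_multiplier(5, 20)  # 5||20 = 4 mOhm -> 5/4 = 1.25
print(m)                       # 1.25, i.e. 25% more power headroom
print(370 * m)                 # a 370W TUF limit would behave like 462.5W
```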


----------



## SoldierRBT

What’s the power draw of your card in Time Spy? 2200MHz avg clocks is very good.


----------



## phoenixyz

MikeGR7 said:


> No idea about the ports because i only use hdmi.
> Only minor downside is that the cooler maxes out at 3000rpm instead of 3200rpm that the fans are capable of.
> Still worth it though, the card is way way WAY more stable even using the aircooler ( i use water now but had it on air since launch).



Thanks for your feedback. I have never had to flash a custom BIOS before. I just saw some people complaining about their middle DisplayPort not working after a flash. And what advice can you give for avoiding bricking the card? Unfortunately the MSI Gaming X Trio has a single BIOS, no dual BIOS like the TUF series. Thanks.


----------



## c0nsistent

So, considering the XC3 Ultra isn't going over 350W regardless of the power target setting, is the 370W TUF BIOS confirmed to give it a little more?


----------



## Shadowdane

Finally managed to snag a 3080 order!! Newegg actually let me get an order in without crashing or the card instantly going out of stock.


----------



## ssgwright

Falkentyne said:


> The lower the resistance, the lower the reported power draw will be.
> Stacking a 5 mOhm shunt on a 5 mOhm original shunt reduces its power reporting by 50%, or in other words, doubles your TDP allowance by 2x (multiply your HWinfo power draw values in "Custom values" by 2, or MSI Afterburner power draw values by " x * 2 " in the Afterburner custom values field). So if GPU-Z reported that you were pulling 400W of board power, the real board power would be 800 Watts.
> 
> Stacking a 3 mOhm shunt on top of a 5 mOhm original shunt increases your TDP allowance by 2.67x (not sure what percentage reduction of power reporting that is?).
> 
> A 10 mOhm shunt on top of a 5 mOhm shunt is a more modest 1.5x power allowance (x * 1.5 for Afterburner), and 15 mOhm on top of a 5 mOhm is 1.33x.
> 
> If you are desoldering and replacing the shunt with a LOWER RESISTANCE SHUNT, for example, desoldering a 5 mOhm and replacing it with a 3 mOhm, is almost exactly the same as stacking an 8 mOhm on top of a 5 mOhm.
> 
> Desoldering is always the most accurate method because then you are assured of good contact, but that is also the hardest and most dangerous method (sometimes people have problems melting the original solder to remove the shunt, and you can damage the board with too much heat too). Stacking shunts by soldering requires soldering experience, but as long as you use flux (you MUST use flux) and tin it properly, it's not hard to do as long as you practice on a spare practice board. Or you can just glue some spare shunts to a test bench and then practice your fluxing and soldering skills there. Using MG842AR Silver paint as its own shunt resistor (bridging the original shunts by painting over them fully) is the safest (no heat involved, you can just use Super 33+ tape around the shunts to protect the board from accidentally getting paint on it, etc), but also the least reliable method, as you can have contact issues if the original shunts are NOT fully flat and level. (MSI and Founder's Edition shunts are known to not be flush).
> 
> Excellent video on soldering with flux.


In that video they use a wire to bridge the PCIe shunt; does that work?


----------



## Falkentyne

ssgwright said:


> In that video they use a wire to bridge the PCIe shunt; does that work?


It works, but I wouldn't suggest it. That wire was acting like a 2 mOhm total shunt resistance.
Some boards won't like the PCIe slot reporting 15W (even if it's actually 30W real) when the 8-pins are reporting 130W (260W real), and it may trigger power throttling from the imbalanced draw.


----------



## EarlZ

I am looking at getting an ASUS TUF 3080/3090, and the reviews seem to indicate that it has a 'lower' power limit compared to other brands. I did a quick Google search and signs point to ASUS having released a higher power limit BIOS; is this the case at the moment?


----------



## SoldierRBT

Has anyone tried to flash a 2x8-pin BIOS onto a 3x8-pin card? For example, the 3080 XC3 BIOS onto a FTW3 Ultra.


----------



## StreaMRoLLeR

SoldierRBT said:


> Has anyone tried to flash a 2x8pin BIOS into a 3x8pin card? For example 3080 XC3 BIOS to a FTW3 Ultra


I saw it on Reddit and the EVGA forum. Nothing changes... a shunt mod is the only way for 2x8-pin cards.


----------



## StreaMRoLLeR

PhoenixMDA said:


> @SoldierRBT
> You need watercooling; you have a better card than I do.
> Here's my TUF OC with the EK cooler. I sold the Alphacool because it was very restrictive on flow.
> https://www.3dmark.com/spy/16203941
> View attachment 2468936


Hey, to hold 2200 you should easily be drawing 450W with 6x shunts, right? 20,573 is in the range of top-tier 3x8-pin cards. GZ dude.


----------



## SoldierRBT

Streamroller said:


> I saw on reddit and evga forum. Nothing changes.. Shunt is the only way for 2 pin


I mean flashing a 2x8-pin BIOS onto a 3x8-pin card, on the 3080s. Some 3090 FTW3 owners have reported higher TDP when flashing the 3090 XC3 BIOS because the power readings on the 3rd 8-pin bug out. I was wondering if this also applies to 3080s.


----------



## StreaMRoLLeR

SoldierRBT said:


> I mean a 2x8pin BIOS into a 3x8pin BIOS on the 3080s. Some 3090 FTW3s owners have reported higher TPD when flashing the 3090 XC3 BIOS because the power readings on the 3rd pin bug out. I was wondering if this also applies to 3080s.


oh MB. Interesting one tbh.


----------



## Aurelienbis

omarrana said:


> Hello everyone,
> I have an Inno3D RTX 3080 iChill X3, which is a reference board. It's a great graphics card that boosts very well to 2100 MHz. Can I flash a Gigabyte BIOS to get 370 watts? Has anyone tried it? Thank you.


I am also interested.

The actual power limit of the Inno3D iChill 3080 X3 is 340W.

Has anyone tested another BIOS with a higher limit?

Edit: with MSI Afterburner I never see 340W. The max seems to be capped at 320W, with some spikes at 330W.


----------



## PhoenixMDA

EarlZ said:


> I am looking at getting an Asus TUF 3080 / 3090 and the reviews seem to indicate that it as a 'lower' power limit compared to other brands, I did a quick google and signs point that Asus may have released a higher power limit bios, is this the case at the moment ?


No, the power limit is low, like on other 2x8-pin cards: best case 330-355W, and in reality you are triggering the power limit there.
The only way is a shunt mod; I have 20 mOhm stacked on mine.


----------



## Miro75

Falkentyne said:


> The lower the resistance, the lower the reported power draw will be.
> Stacking a 5 mOhm shunt on a 5 mOhm original shunt reduces its power reporting by 50%, or in other words, doubles your TDP allowance by 2x (multiply your HWinfo power draw values in "Custom values" by 2, or MSI Afterburner power draw values by " x * 2 " in the Afterburner custom values field). So if GPU-Z reported that you were pulling 400W of board power, the real board power would be 800 Watts.
> 
> Stacking a 3 mOhm shunt on top of a 5 mOhm original shunt increases your TDP allowance by 2.67x (not sure what percentage reduction of power reporting that is?).
> 
> A 10 mOhm shunt on top of a 5 mOhm shunt is a more modest 1.5x power allowance (x * 1.5 for Afterburner), and 15 mOhm on top of a 5 mOhm is 1.33x.
> 
> If you are desoldering and replacing the shunt with a LOWER RESISTANCE SHUNT, for example, desoldering a 5 mOhm and replacing it with a 3 mOhm, is almost exactly the same as stacking an 8 mOhm on top of a 5 mOhm.
> 
> Desoldering is always the most accurate method because then you are assured of good contact, but that is also the hardest and most dangerous method (sometimes people have problems melting the original solder to remove the shunt, and you can damage the board with too much heat too). Stacking shunts by soldering requires soldering experience, but as long as you use flux (you MUST use flux) and tin it properly, it's not hard to do as long as you practice on a spare practice board. Or you can just glue some spare shunts to a test bench and then practice your fluxing and soldering skills there. Using MG842AR Silver paint as its own shunt resistor (bridging the original shunts by painting over them fully) is the safest (no heat involved, you can just use Super 33+ tape around the shunts to protect the board from accidentally getting paint on it, etc), but also the least reliable method, as you can have contact issues if the original shunts are NOT fully flat and level. (MSI and Founder's Edition shunts are known to not be flush).
> 
> Excellent video on soldering with flux.


Falkentyne, I have spoken with this guy, the author of the video you linked. Interesting: he modded only 3 resistors (2×8-pin + PCIe). He didn't touch the other shunts, and it seems to work flawlessly. Does that mean we were all wrong to shunt every resistor, including SRC, MEM, and chip?
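For reference, the multipliers Falkentyne quotes all follow from simple parallel-resistance math; a minimal sketch (figures assume a 5 mOhm original shunt, as in the quote):

```python
def stacked_multiplier(original_mohm, stacked_mohm):
    """TDP multiplier when a shunt is stacked in parallel on the original.

    The controller still assumes the original resistance, so real board
    power is under-reported by the factor original / effective.
    """
    # Parallel resistors: R_eff = R1 * R2 / (R1 + R2)
    r_eff = original_mohm * stacked_mohm / (original_mohm + stacked_mohm)
    return original_mohm / r_eff

print(stacked_multiplier(5, 5))   # 2.0x  (400W reported = 800W real)
print(stacked_multiplier(5, 3))   # ~2.67x
print(stacked_multiplier(5, 10))  # 1.5x
print(stacked_multiplier(5, 15))  # ~1.33x
```

The same function covers the replacement case too: swapping a 5 mOhm for a 3 mOhm gives 5/3 ≈ 1.67x, which is indeed almost identical to stacking an 8 mOhm on top of a 5 mOhm (5 / (40/13) ≈ 1.625x).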


----------



## Falkentyne

Miro75 said:


> Falkentyne, I have spoken with this guy, the author of the video you linked. Interesting: he modded only 3 resistors (2×8-pin + PCIe). He didn't touch the other shunts, and it seems to work flawlessly. Does that mean we were all wrong to shunt every resistor, including SRC, MEM, and chip?


GPU chip power has its own power limit. I don't know what that limit is on other cards; on the 3090 FE it is 300W. If it reaches 290W, it will begin to flag a power limit. I tested that. Not shunting it will still allow more power just from doing the 8-pins and slot, but it will limit your max power, depending on where its internal power limit is set. This will also cause your TDP Normalized % to be much higher than TDP % in HWiNFO64, because Normalized recognizes the chip power limit.


----------



## MakubeX

omarrana said:


> Hello everyone,
> I have an Inno3D RTX 3080 iChill X3, which is a reference board. It's a great graphics card that boosts very well to 2100MHz. Can I flash the Gigabyte BIOS to get 370W? Has anyone tried it? Thank you.


Not sure about the Gigabyte bios but you can probably flash the Zotac Amp Holo bios without issues which gets you a power limit of 374 W. The Zotac card has a reference board with 2 x 8-pin connectors and 1 HDMI port like yours. I flashed this bios on my PNY RTX 3080 Uprising, which is a base model reference card similar to yours, to get a slight boost in power limit.


----------



## Zeakie

MakubeX said:


> Not sure about the Gigabyte bios but you can probably flash the Zotac Amp Holo bios without issues which gets you a power limit of 374 W. The Zotac card has a reference board with 2 x 8-pin connectors and 1 HDMI port like yours. I flashed this bios on my PNY RTX 3080 Uprising, which is a base model reference card similar to yours, to get a slight boost in power limit.


I flashed the Holo BIOS on my Trinity and it gave me the full 375W. Cyberpunk easily hits 369W. You probably won't get a better BIOS for a 2×8-pin card.


----------



## ZealotKi11er

Moved from 4th slot to 2nd slot in 3 weeks. Progress boys.


----------



## ViTosS

Guys, I think something weird is happening with my card: the boost clock fluctuates a lot depending on the OC I set (even with fixed voltage and fixed boost via the voltage/frequency curve in MSI AB). In most games the highest boost I can keep almost locked is 1980-2010MHz. If I try anything higher, like 2050-2100MHz, even though my card can do it fine, the fluctuations start: standing still in any game, doing nothing, it varies from 1935-2070MHz like crazy. I've seen this in every game I've played (Cyberpunk 2077, AC Valhalla, Horizon Zero Dawn, Shadow of the Tomb Raider, etc). Any idea? I tried the EVGA XOC BIOS and the problem persists. Anyone experiencing the same issue?


----------



## phoenixyz

Guys, can anyone recommend a decent 450W BIOS that would work with my MSI Gaming X Trio RTX 3080? I have a good chip but am severely power limited. The only worry is that it has a single BIOS, and if I flash the wrong one, it might brick the card. Thanks.


----------



## Falkentyne

phoenixyz said:


> Guys, can anyone recommend a decent 450W BIOS that would work with my MSI Gaming X Trio RTX 3080? I have a good chip but am severely power limited. The only worry is that it has a single BIOS, and if I flash the wrong one, it might brick the card. Thanks.


Did you look into shunt modding? That card has fuses, but you can safely stack 10 or 15 mOhm "Current sensing 2512 size" shunts and gain a decent improvement.
I believe MSI, like Founder's Editions, have the edges of the shunts lower than the middle (yuck), so if you solder, make sure you use flux and "tin" the work first, creating a solder joint that the new shunt can rest on, then melt it into a bond.

15 mOhm Shunts: out of stock but: https://www.mouser.com/ProductDetai...12dbJfB0K6s6o%2BXGIIw/Ddm80HvdvgsuqTrPJ4DhA==

These are 10 mOhms, in stock.


https://www.mouser.com/ProductDetail/Panasonic/ERJ-M1WSF10MU?qs=%2Fha2pyFaduhOx12dbJfB0MsN%2FezM8qu6pargcXxcXkqNR6e9JV9E2g%3D%3D



Here's a literally perfect video on this method, along with a written explanation on what he did.






If you're new to soldering but can follow these instructions with flux and solder, please consider buying some Kapton tape to completely cover the area around the shunts! That way if solder drops someplace, it will drop on the tape instead of the PCB 









3M Polyimide Tape, 92, Scotch, 1/4" X 36 Yards, High Temp. - Masking Tape - Amazon.com







You can also, instead of stacking shunts with soldering, use MG842AR silver paint as its own shunt. However it's difficult to get good contact with the paint when the shunt edges are lower than the middle, since you don't want the paint to get on the PCB (this even causes issues with soldering for some people!), so I suggest insulating the areas around the shunts first with Super 33+ tape, which helps a LOT here. If you're going to paint the shunts instead of stacking, take a small flat screwdriver and scrape the conformal coating off the edges of the original shunts carefully.



https://www.amazon.com/gp/product/B01MCXW1Y1/ref=ppx_yo_dt_b_asin_title_o02_s00?ie=UTF8&psc=1



https://www.amazon.com/gp/product/B00004WCCL/ref=ppx_yo_dt_b_search_asin_title?ie=UTF8&psc=1 (this stuff is VERY useful!!)









GitHub - bmgjet/ShutMod-Calculator: Work out what shunt values to use easily.




----------



## PhoenixMDA

SoldierRBT said:


> What’s the power draw of your card in Time Spy? 2200MHz avg clocks is very good.


I think only 410W average and ca. 420W max; the power throttle kicks in earlier than the possible maximum power.
In Furmark the power goes up to 445W average. I don't know why in some cases you can pull more power and in others less.


----------



## MikeGR7

phoenixyz said:


> Thanks for your feedback. I have never flashed a custom BIOS before. I just saw some people complaining about their middle DisplayPort not working after a flash. What advice can you give for avoiding a bricked card? Unfortunately the MSI Gaming X Trio has a single BIOS, no dual BIOS like the TUF series. Thanks.


It is really easy my friend, fear nothing.

Exit every background program and utility, like Afterburner or Precision X.

Then simply follow the flashing instructions from page 1 of this thread.

I suggest you download the latest (check vbios build date) ASUS STRIX vbios from Techpowerup database.

Flash, restart pc, done.

Bonus: I always make a full driver uninstall using DDU from Guru3D after the flash and then install latest drivers using NVCleanstall.


----------



## phoenixyz

MikeGR7 said:


> It is really easy my friend, fear nothing.
> 
> Exit every background program and utilities like afterburner - precision X.
> 
> Then simply follow the flashing instructions from page 1 of this thread.
> 
> I suggest you download the latest (check vbios build date) ASUS STRIX vbios from Techpowerup database.
> 
> Flash, restart pc, done.
> 
> Bonus: I always make a full driver uninstall using DDU from Guru3D after the flash and then install latest drivers using NVCleanstall.


Thanks, I'll give it a try. But does it disable things like the RGB on my card? Also, the latest NVFlash package doesn't have nvflash64, just nvflash; it's version 5.667. Thanks.


----------



## ssgwright

well this is the best I can get so far... gonna throw the shunts on next weekend: http://www.3dmark.com/pr/635062


----------



## phoenixyz

@MikeGR7 it frigging worked. Gosh, this works wonders. My Superposition score jumped from 14816 (old BIOS, overclocked) to 15100 just by moving the power sliders. My stock speed on the new BIOS smashes my old BIOS, which was heavily overclocked. This is crazy. My card boosts to 2010MHz without OC and holds 2GHz stable. This chip is really good. I'll try hitting 2130MHz core tonight. Here is my stock performance so far. Thanks for your help.

Old score, with the old BIOS overclocked. I only changed my CPU, but the scores were the same either way.










New score, just from moving the power slider from 100 to 121; no OC, base stock clocks.
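As a rough sanity check on what a slider change means in watts (assuming, hypothetically, a ~370W default TDP such as the Strix 3080 OC BIOS ships with; the real default depends on the flashed BIOS):

```python
def slider_to_watts(default_tdp_w, slider_pct):
    # Afterburner's power slider scales the BIOS default TDP linearly.
    return default_tdp_w * slider_pct / 100

# Hypothetical 370W default at the 121% slider position:
print(slider_to_watts(370, 121))  # 447.7
```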


----------



## NDS322

My Zotac RTX 3080 AMP HALO BLACK burned out yesterday. It happened on the back of the card, at the top right of the GPU position.

Only 4 days after receiving it from the retailer.


----------



## ViTosS

ViTosS said:


> Guys, I think something weird is happening with my card: the boost clock fluctuates a lot depending on the OC I set (even with fixed voltage and fixed boost via the voltage/frequency curve in MSI AB). In most games the highest boost I can keep almost locked is 1980-2010MHz. If I try anything higher, like 2050-2100MHz, even though my card can do it fine, the fluctuations start: standing still in any game, doing nothing, it varies from 1935-2070MHz like crazy. I've seen this in every game I've played (Cyberpunk 2077, AC Valhalla, Horizon Zero Dawn, Shadow of the Tomb Raider, etc). Any idea? I tried the EVGA XOC BIOS and the problem persists. Anyone experiencing the same issue?


Well, I figured it out: my GPU for some reason doesn't like anything above 1.025v (1.031v specifically). If I force the voltage above that, the crazy fluctuations start. I managed to get 2085MHz stable at 1.025v, as long as I keep it cool, under 65C.


----------



## Aurelienbis

NDS322 said:


> My Zotac RTX 3080 AMP HALO BLACK burned out yesterday. It happened on the back of the card, at the top right of the GPU position.
> 
> Only 4 days after receiving it from the retailer.


Why ? What did you do exactly ?


----------



## marashz

====== EDITED A BIT

Got my 3080 XC3 Ultra. It heats up to 80C, so I must keep the case side open (or optimize my case fans). With fans at 100% it stays under 65C, I think, with my best OC.
The best OC I've managed so far is about 2070MHz. I'm thinking about a step-up to the FTW3, but maybe once more 3080s are in stock.

I'm not sure about the step-up. I don't know if I want to keep the XC3, order a waterblock, and just play games at max stable OC, or step up, with shipping, waiting, and again no games...
Would water cooling give me like 10-20W more headroom? No fans, no LEDs connected to the PCB. Or not? What's the best way to check OC potential (without shunt modding; I don't want that yet)? Fans at max, undervolt to 0.9V, find the max stable core clock, go up in voltage, and repeat with core clocks?


----------



## NDS322

Aurelienbis said:


> Why ? What did you do exactly ?


Just normal use, with an EVGA SuperNOVA 850W G3 (80+ Gold) PSU.

The GPU temp is only around 75C, but I'm not sure about the VRM at that position.


----------



## MikeGR7

phoenixyz said:


> @MikeGR7 it frigging worked. Gosh, this works wonders. My Superposition score jumped from 14816 (old BIOS, overclocked) to 15100 just by moving the power sliders. My stock speed on the new BIOS smashes my old BIOS, which was heavily overclocked. This is crazy. My card boosts to 2010MHz without OC and holds 2GHz stable. This chip is really good. I'll try hitting 2130MHz core tonight. Here is my stock performance so far. Thanks for your help.
> 
> Old score, with the old BIOS overclocked. I only changed my CPU, but the scores were the same either way.
> 
> New score, just from moving the power slider from 100 to 121; no OC, base stock clocks.


Congratulations on your new scores!

If I remember correctly, I could get to around 2100 without hitting the power limit.
I bet you will get around that too!

Also, if you find the power limit kicking in too early (450W is not much, really), you can use the VF curve in Afterburner and cut back some voltage to gain watts.

The MSI Trio is a hidden gem of a card; its 3×8-pin connectors are rare and much more valuable than, say, the ASUS TUF's VRMs. VRMs are sufficient on all cards anyway.

I was sad to see people choose the TUF over the Trio... I had 2x TUF and 2x Trio, and the Trios consistently beat the TUFs in both performance and temperatures.

So I kissed their metal shrouds goodbye lol

P.S. I see you just joined 5 days ago, so welcome! Besides the Reply button there is a Rep+ button that you can use to thank people who helped you or the community with their posts. Have fun!


----------



## reflex75

Falkentyne said:


> GPU chip power has a power limit. I don't know what that limit is on other cards. On 3090 FE, it is 300W. If it reaches 290W, it will begin to call a power limit. I tested that. Not shunting it will still allow more power just from doing 8 pins and Slot, but it will limit your max power, depending on where its internal power limit is set to. This will also cause your tdp normalized% to be much higher than TDP% in hwinfo64, because normalized will recognize the chip power limit.


We both have the 3090 FE, and I know you know it's 350W default and 114% max power for 400W.


----------



## Falkentyne

reflex75 said:


> We both have the 3090 FE, and I know you know it's 350W default and 114% max power for 400W.


Please read my post again carefully this time....


----------



## reflex75

Falkentyne said:


> Please read my post again carefully this time....


My bad, I read it too fast, sorry🙃
"GPU chip power has a power limit"


----------



## Falkentyne

reflex75 said:


> My bad, I read it too fast, sorry🙃
> "GPU chip power has a power limit"


Yeah. When shunt modding, you need to mod that too, or the chip could reach its power limit before the board reaches its TDP limit (from the 8-pins and slot power total).


----------



## PhoenixMDA

Here's the performance of the EK TUF cooler.
The water IN/OUT sensors are digital Calitemp; the card is shunt-modded, everything ×1.25 (20 mOhm).
It hovers around 435W, I would say. The CPU, at 55W, sits upstream of the GPU in the loop, which perhaps adds up to 0.5C to the delta.

197 L/h: max 447.5W, delta 14-14.5C









230+ L/h: max 454W, delta 13-13.5C
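From those flow/delta figures, the block's effective GPU-to-water thermal resistance can be estimated; a back-of-the-envelope sketch using the numbers above:

```python
def delta_per_watt(delta_c, power_w):
    """Effective thermal resistance (degC per watt) from a temperature delta."""
    return delta_c / power_w

# ~14.25C delta at 447.5W (197 L/h flow rate)
print(round(delta_per_watt(14.25, 447.5), 4))  # 0.0318
```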


----------



## iunlock

Hey guys, a little late to the party lol, but thought I'd share.

I've been testing and tuning Cyberpunk and it has been absolutely amazing, easily making any hardware beg for mercy with truly maxed-out settings w/ Psycho. I've been having a blast tuning the MSI 3080 with Cyberpunk. Gosh, I can't wait to get the water block on this card...

On the stock blower w/ 100% fans...

RTX 3080 | 9900K
Cyberpunk 2077 @ 4K
Fully maxed out settings...
DLSS - Quality 
Psycho mode

So far...
2160MHz Core / 10,000MHz+ Mem (using 3DMark metrics)

That seems to be the limit with the stock blower, as the game would flat-line with anything higher after playing for a while and/or in a heavy fight scene, but dang, I'm not complaining.

This is enough data for me to really be excited for the water block and on to benching it'll surely go.

The game is incredible, and it should be the new gold standard for benchmarks at 4K + fully maxed settings.

For 4K gaming, there is very little difference between Ultra and High so with a customized setting and with the help of DLSS, one could enjoy a nice 4K experience from one of the finest titles to date. 

I'll be doing some more extensive testing today...


----------



## iunlock

Fire Strike w/ the 3080. 

The gap between the 3090 and 3080, FPS-wise, is so slim that for the money it's hard to justify or recommend spending hundreds more on the 3090 to clients.

In Cyberpunk, this 3080 has already blown past a stock 3090. Even with the 3090 OC'ed I don't see there being a realistic noticeable improvement with the frames already being in a territory that's way beyond good.

With that said...of course I'm still after the 3090/ti/kp lol.



https://www.3dmark.com/fs/23903049











Pretty impressed with the graphics score. Not too shabby.


----------



## PhoenixMDA

@iunlock
This is unplayable; I have around 50FPS with DLSS Quality at 5120x1440 with [email protected], and that is my lower limit^^.
I don't know why this game needs so much horsepower...


----------



## phoenixyz

iunlock said:


> Fire Strike w/ the 3080.
> 
> The gap between the 3090 and 3080, FPS-wise, is so slim that for the money it's hard to justify or recommend spending hundreds more on the 3090 to clients.
> 
> In Cyberpunk, this 3080 has already blown past a stock 3090. Even with the 3090 OC'ed I don't see there being a realistic noticeable improvement with the frames already being in a territory that's way beyond good.
> 
> With that said...of course I'm still after the 3090/ti/kp lol.
> 
> 
> 
> https://www.3dmark.com/fs/23903049
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Pretty impressed with the graphics score. Not too shabby.


Pretty impressive scores. What voltage are you running your GPU at? Thanks.


----------



## iunlock

Cyberpunk definitely doesn't need to be maxed out. A good combo of settings for 1440p is Ultra presets, ray tracing (reflections only), and DLSS on Quality. Try that out and see how it works for you.

I'm playing on an LG 38" ultrawide @ 160Hz, at both 1440p and 4K.


----------



## phoenixyz

iunlock said:


> Cyberpunk definitely doesn't need to be maxed out. A good combo of settings for 1440p is Ultra presets, ray tracing (reflections only), and DLSS on Quality. Try that out and see how it works for you.
> 
> I'm playing on a LG 38" wide screen @ 160Hz. Both at 1440p and 4K.
> 


Are u running ur gpu on water. The temps are pretty good for the sustained clocks?


----------



## iunlock

phoenixyz said:


> Pretty impressive scores. what voltage are u running your gpu at? Thanks


That's with the stock vBIOS, and the max voltage I've seen is 1.081v... I've been deciding which firmware to flash since the stock power limit caps out at 320W.

I was thinking maybe the Zotac vBIOS?











phoenixyz said:


> Are u running ur gpu on water. The temps are pretty good for the sustained clocks?


Surprisingly, it's on the stock blower @ 100% max fans. I've been scratching my head too... really impressed.


----------



## phoenixyz

iunlock said:


> That's with the stock vbios and the max voltage I've seen is 1.081v... I've been deciding which firmware to flash on there as the stock wattage caps out at 320W.
> 
> I was thinking maybe the zotac vbios?


Damn, that is pretty impressive for a stock BIOS. I flashed a Strix 3080 BIOS and it gave the card more headroom. I can push my card to 2150MHz on the core, but the issue is temps: when they climb past 70C, the frequency drops. I think I need to tinker with my voltages to give my card more thermal headroom. It's an MSI Gaming X Trio, btw.


----------



## iunlock

phoenixyz said:


> Damn, that is pretty impressive for a stock BIOS. I flashed a Strix 3080 BIOS and it gave the card more headroom. I can push my card to 2150MHz on the core, but the issue is temps: when they climb past 70C, the frequency drops. I think I need to tinker with my voltages to give my card more thermal headroom. It's an MSI Gaming X Trio, btw.


Very nice. I've been pretty impressed with MSI cards lately. As an owner of primarily EVGA cards, I'd say the ratio of landing a good bin with MSI has been on par with EVGA FTW3 cards...


----------



## phoenixyz

iunlock said:


> Very nice. I've been pretty impressed with MSI cards lately. As an owner of primarily EVGA cards, I'd say the ratio of landing a good bin with MSI has been on par with EVGA FTW3 cards...


Yeah, the chip is awesome. Your temps, on the other hand, are magical; I thought you were on a custom loop. My temps are in the mid-to-high 70s and my clocks go down. The chip can hit 2.2GHz if I solve the temp issues. I'm considering putting it under water later.


----------



## iunlock

phoenixyz said:


> Yeah, the chip is awesome. Your temps, on the other hand, are magical; I thought you were on a custom loop. My temps are in the mid-to-high 70s and my clocks go down. The chip can hit 2.2GHz if I solve the temp issues. I'm considering putting it under water later.


I have a water block and a back plate on the desk next to me just staring at me lol. My main gaming rig is a full custom loop...can't wait to get this card on water, but I've been enjoying the game so much that I'm just driving it as is for now.

Super nice cards indeed.


----------



## PhoenixMDA

iunlock said:


> Cyberpunk definitely doesn't need to be maxed out. A good combo of settings for 1440p is to set it at Ultra presets, Ray tracing (reflections only) and with DLSS on quality. Try that out and see now works for you.
> 
> I'm playing on a LG 38" wide screen @ 160Hz. Both at 1440p and 4K.


I left it like that; 50-60FPS is playable. But I turn chromatic aberration off, as it makes the edges of the screen blurry.

I must say DLSS 2.0 is much better than 1.0.


----------



## BluemoonRisen

Zeakie said:


> I flashed the Holo BIOS on my Trinity and it gave me the full 375W. Cyberpunk easily hits 369W. You probably won't get a better BIOS for a 2×8-pin card.


Do you have the 3080 Trinity or the OC ?

Because mine (Trinity) won't go over 330W with the AMP BIOS.


----------



## StreaMRoLLeR

PhoenixMDA said:


> @iunlock
> This is unplayable; I have around 50FPS with DLSS Quality at 5120x1440 with [email protected], and that is my lower limit^^.
> I don't know why this game needs so much horsepower...


Use DLSS Balanced for 5120x1440. We have the same monitor.


----------



## phoenixyz

MikeGR7 said:


> Congratulations for your new scores!
> 
> If i remember correctly i could get around 2100 without hitting the power limit.
> I bet you will get around that too!
> 
> Also if you find the power limit kicking in too early, 450 is not much really, you can also use the VF curve in afterburner and cut back some voltage to gain watts.
> 
> MSI Trio is a hidden gem of a card, that 3 pins are rare and much more valuable than say ASUS TuF's vrms. Vrms are sufficient anyway on all cards.
> 
> I was sad to see people choose Tuf over Trio... I had 2X TUF and 2XTRIO and the TRIOs consistently beat the Tufs any day in both performance and Temperatures.
> 
> So i kissed their metal shrouds goodbye lol
> 
> P.S. I see you just joined 5 days ago so welcome and besides the Reply button there is a Rep+ button that you can use to thank people that helped you or the community with their posts. Have fun!



Hey bro. Big thanks for your tips. So far I'm impressed with the new BIOS. My only challenge is temps. My chip is good; I can hit 2.2GHz stable, but temps shoot up to the mid-70s. I'm trying to find a decent VF-curve-to-performance ratio. What are your voltage settings? Can you please share a screenshot? I'm hovering at 2085 to 2100 due to thermal limits. Thanks.


----------



## iunlock

PhoenixMDA said:


> I let it so, 50-60FPS ist playable, but i do Chromatic Aberration off, it make's the sides unsharp.
> 
> I musst say DLLS 2.0 is much better than the 1.0.





phoenixyz said:


> Hey bro. Big thanks for your tips. So far I'm impressed with the new BIOS. My only challenge is temps. My chip is good; I can hit 2.2GHz stable, but temps shoot up to the mid-70s. I'm trying to find a decent VF-curve-to-performance ratio. What are your voltage settings? Can you please share a screenshot? I'm hovering at 2085 to 2100 due to thermal limits. Thanks.


Figured you guys would like this lol. For us Cyberpunk 2077 players who understand what it's really like...






Pretty much sums it up. [emoji14]


----------



## ViTosS

Guys, is it normal behavior for an OC'd 3080 that when we do a voltage/frequency-curve OC and throw too much voltage at it, the card's boost clock starts fluctuating a lot? Whenever I try higher clocks and push the voltage above 1.031v, I lose that locked boost. What is this related to?


----------



## StreaMRoLLeR

ViTosS said:


> Guys, is it normal behavior for an OC'd 3080 that when we do a voltage/frequency-curve OC and throw too much voltage at it, the card's boost clock starts fluctuating a lot? Whenever I try higher clocks and push the voltage above 1.031v, I lose that locked boost. What is this related to?


Increase the power limit as well. Your card could be hitting the power limit and doing a "boost cut". In a successful VF OC there should not be a drop (temperature factor aside).


----------



## StreaMRoLLeR

phoenixyz said:


> Hey bro. Big thanks for your tips. So far I'm impressed with the new BIOS. My only challenge is temps. My chip is good; I can hit 2.2GHz stable, but temps shoot up to the mid-70s. I'm trying to find a decent VF-curve-to-performance ratio. What are your voltage settings? Can you please share a screenshot? I'm hovering at 2085 to 2100 due to thermal limits. Thanks.


Every card's behaviour is different, bro. You have to spend some time finding your chip's settings.

I can give you a tip though: play around between 1.050V and 1.075V.


----------



## OmegaRED.

Is there an increased-power-limit BIOS for the 3080 TUF OC? From what I've read here, all the 3×8-pin BIOSes (Strix / FTW3 Ultra) won't work on the TUF. My card won't even stay at 2000MHz, even though temps are super low.

I have tried every combination of power limit, voltage, and fan profile, but I can't get it to stay over 1935MHz most of the time. _Feels_ like a power limitation.


----------



## devilhead

Which BIOS is best to use with just the 2×8-pin shunts modded? I have a reference 3080 with 2×8-pin.
I tested the TUF OC 3080 BIOS; it's the same as stock, and it looks like it quickly hits the PCIe slot power limit (~65W).


----------



## cennis

Zeakie said:


> I flashed the holo bios on my trinity and it gave me the full 375w  cyberpunk easily hits 369w you probably won't get better bios for 2pin


Is this a Trinity or a Trinity OC?


----------



## devilhead

.


----------



## benbenkr

Hi guys, just want to ask if there is a point in OCing the VRAM on the 3080. I have it at +500MHz now, but it seems to be... well, adding nothing to performance other than superfluous points in benchmarks?


----------



## cstkl1

benbenkr said:


> Hi guys, just want to ask if there is a point in OCing the VRAM on the 3080. I have it at +500MHz now, but it seems to be... well, adding nothing to performance other than superfluous points in benchmarks?


+1000. If you can run at 21, keep it at that. GDDR6X was originally designed for 21 Gbps; the 19 Gbps variant etc. was due to operating temperature, power delivery conditions, and so on.
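For context on what that offset buys: a +1000MHz Afterburner offset corresponds to roughly +2 Gbps effective data rate, since the tool works in half-data-rate MHz (treat that mapping as an assumption for your tool version). Peak bandwidth then scales linearly with the data rate:

```python
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s = (bus width in bytes) * data rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(320, 19))  # 760.0 (stock RTX 3080)
print(bandwidth_gb_s(320, 21))  # 840.0 (at 21 Gbps)
```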


----------



## akkuman

Any tips for the AORUS GeForce RTX™ 3080 XTREME WATERFORCE 10G?


----------



## BluePaint

Cooling the caps on the back of the card with a high-RPM fan really helps me keep 2150+ MHz in Cyberpunk. With less fan cooling, like I had before, it would crash occasionally even below 2100.

SoldierRBT mentioned before that cap cooling can make a difference of 15-30MHz for benchmarks. In my case, for gaming, it seems to make a difference of almost 100MHz.

That makes me want to cap-mod my MSI Gaming, because it only has a single MLCC group. The higher-end MSI Suprim X, which uses exactly the same PCB, uses 4 MLCC groups.

Does anyone have the exact specs (so that I can find them in an online shop) for MLCCs that can be used for this? I'm not really familiar with these things and had trouble identifying the exact type.


----------



## blurp

I have the chance to get either an ASUS TUF 3080 or an EVGA 3080 FTW3 Ultra.
I plan to water cool and OC moderately.
Guys, what would you get?
Side question: is a water block available for the FTW3 Ultra? I know EKWB has one planned for the end of January.
Thanks!


----------



## BluePaint

The TUF is power limited. Without shunts, the benefit of watercooling would be less than with an FTW3, which can use 100W more than the TUF with a BIOS update.


----------



## marashz

The FTW3 will get a waterblock from Watercool: the Heatkiller V.


__ https://twitter.com/i/web/status/1334898173647736838


----------



## ZealotKi11er




----------



## phoenixyz

BluePaint said:


> TUF is power limited. Without shunts, benefit of watercooling would be less than with a FTW which can use 100W more than the TUF with a BIOS update.


Are you on liquid cooling for your GPU? How do you keep temps down at high clocks? I am using a Strix BIOS on my MSI Gaming X Trio, and it hits 2.2GHz, but the temps rise so fast and the clocks drop. Maybe water is the only way out.


----------



## BluePaint

@phoenixyz
Well, dropping clocks at 2200MHz is probably unavoidable, even on water, as time goes on and the water warms up.

I am still on air. I removed the fans with the shroud and replaced them with 2x 140mm 3000RPM Noctua fans (only slightly better than stock). I have no case yet and the whole thing sits on the windowsill. For benching I open the window completely, and for high-frequency gaming I open it slightly for some fresh air. When gaming I wear headphones, which helps against the fan noise.

Water would be more efficient, but I really want to get rid of the card as soon as the 3080 Ti hits the shops (which can take a while, yes), so I don't want to modify it further atm.


----------



## Garrett1974NL

Those of you who watercooled your card, what are your deltas?
I have a D5 pump at 5 (full speed) with two 420 rads and six 140mm fans at around 750 RPM.
Under 99% load, for example in DayZ or other games, I have a 14-15C delta, so if my water is 35C the GPU temp will be 49-50C; again, that is at 99% load.
I had a 2080 Ti before this, with a Heatkiller IV full-cover block, but that had liquid metal, so I had a delta of 7-8C, which was really good. This time I went with EK's Ectotherm.
I have their Reference Edition block on the 3080.


----------



## StreaMRoLLeR

phoenixyz said:


> Are you on liquid cooling for your GPU? How do you keep temps down at high clocks? I am using a Strix BIOS on my MSI Gaming X Trio, and it hits 2.2GHz, but the temps rise so fast and the clocks drop. Maybe water is the only way out.


What is your method of hitting 2200MHz? If you are doing this while power limited (the power-limit flag at 1 all the time in MSI AB), that's why the clock drops. Do it with a VF OC and lock the voltage.


----------



## phoenixyz

StreaMRoLLeR said:


> What is your method of hitting 2200MHz? If you are doing this while power limited (the power-limit flag at 1 all the time in MSI AB), that's why the clock drops. Do it with a VF OC and lock the voltage.


I'll try the VF OC. I just need to strike the right balance between voltage and clocks so I can clock high and hold good temps. My BIOS is the 450W Strix.


----------



## man from atlantis

Woot, just ordered a Palit GameRock. I'm not a fan of the looks and have a non-RGB, no-BS build, but hell, it's the cheapest RTX 3080 I could find.

It does have 21 power stages; you can update the OP.










Thanks to the separate VRAM/VRM heatsink, its memory runs cooler than the rest, while the GPU die is a few degrees hotter than the TUF's.


----------



## Micko

Am I right to assume that when water cooling a power-limited card such as a two-8-pin 3080, the card not only has more power available for the GPU because the stock fans are disconnected, but the chip itself also draws less power because it is operating at a 20C+ lower temperature?

How much additional power could the GPU itself draw that way? Does 10-20W from the lack of fans and another 20W from the lower temps sound reasonable?


----------



## lmfodor

Hi! Has anyone tried the new BIOS for the ASUS TUF OC? I installed it a few hours ago with no issues. I will see how it performs with my old OC settings.

These are the notes for the new version:
-Further optimize the performance for 0dB fan feature
-Fixed motherboard “beeping” bug during computer start-up


----------



## iunlock

*DCS World (Flight Sim) @ 3840x1600 (LG 38" Curved Wide Screen)*

RTX 3080 VRAM Usage: Maxed..









The 10GB is NOT enough for 1440p+/4K gaming/flying at maxed-out settings in certain demanding games or flight sims.

For 1080p gamers, the 3080 is more than adequate.

System RAM Usage:









@BluePaint, because the 3080 is inadequate for my usage, I'm also jumping up, as I have a 3090 en route; that said, a 3080Ti would also be perfectly fine, and I wouldn't mind one. I'll be putting the 3090 on water for sure, as I already have the block...


----------



## rankftw

Is it safe to flash the Amp Holo BIOS to a Palit Gaming Pro? I noticed the Amp Holo has a custom board and the Palit has a reference board, does this matter?


----------



## NavaGabe

Any BIOS recommendations for the Ventus 3X OC? I feel like the card is limited by the power limit, because at full load I stay under 50C with 100% fan speed.


----------



## ssgwright

oh wow there is a new TUF bios out


----------



## ssgwright

Wow, I'm pulling about 50 more watts with this new TUF BIOS; can anyone else confirm? I was barely hitting 12,700 on the last TUF BIOS, and now, without even trying, I just hit 12,800!!!









I scored 12 816 in Port Royal


Intel Core i9-10850K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10




www.3dmark.com


----------



## SoldierRBT

ssgwright said:


> wow I'm pulling about 50 more watts from this new tuf bios, can anyone else confirm? I was barely hitting 12,700 on the last tuf bios now without even trying I just hit 12,800!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 12 816 in Port Royal
> 
> 
> Intel Core i9-10850K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10
> 
> 
> 
> 
> www.3dmark.com


That's awesome for TUF owners. What's the TDP in the new BIOS? You can check it with GPU-Z.


----------



## Warrimonk

ssgwright said:


> wow I'm pulling about 50 more watts from this new tuf bios, can anyone else confirm? I was barely hitting 12,700 on the last tuf bios now without even trying I just hit 12,800!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 12 816 in Port Royal
> 
> 
> Intel Core i9-10850K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10
> 
> 
> 
> 
> www.3dmark.com


50 more watts... so does that mean you're pulling 370W now? I want to try this on my XC3U, since I'm hitting the 340W limit super easily. It's even holding back my memory OC.


----------



## Micko

ssgwright said:


> wow I'm pulling about 50 more watts from this new tuf bios, can anyone else confirm? I was barely hitting 12,700 on the last tuf bios now without even trying I just hit 12,800!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 12 816 in Port Royal
> 
> 
> Intel Core i9-10850K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10
> 
> 
> 
> 
> www.3dmark.com


Could you please screenshot GPU-Z or Afterburner after a Port Royal run? BTW, is your TUF modded in any way? A 2160MHz average clock during Port Royal is insanely good for a TUF; you're probably in the top 1% of 3080 TUF owners.


----------



## Warrimonk

Micko said:


> Could you please screenshot a GPUZ or Afterburner after a Port Royal run ? BTW is your TUF modded in any way ? 2160MHZ average clock during Port Royal is insanely good for TUF, you are probably in a top 1% of 3080 TUF owners.


No kidding. My 3080 XC3 can do 2050MHz at 0.9V and +650 memory, yet thanks to the power limit I can't even get over 11,750 in Port Royal. The only other 2x8-pin partner cards I've seen past 12,000 in PR are the 3080 FE and the 3080 Aorus Master (non-Extreme), both limited to 370W.


----------



## f0leyme1ster

NavaGabe said:


> Any BIOS recommendations for the Ventus 3X OC, i feel like the card is limited by the power limit because at full load i stay under 50c with 100% fan speed.


Seconded; I've got the same card and the power limit is getting annoying.


----------



## ssgwright

It is modded and on water. I have liquid metal on all the shunts (my 5 mOhm shunts and flux just arrived yesterday), so I'm going to get these shunts soldered before the liquid metal makes them fall off, lol.

So I can't really give you guys an accurate GPU-Z screenshot with the shunts modded, sorry. I can say that the old-BIOS GPU-Z reading during PR was 250W; now it's close to 300W with this new BIOS.


----------



## MakubeX

iunlock said:


> *DCS World (Flight Sim) @ 3840x1600 (LG 38" Curved Wide Screen)*
> 
> RTX 3080 VRAM Usage: Maxed..
> View attachment 2469805
> 
> 
> The 10GB is NOT enough for 1440p+/4K gaming/flying at Maxed Out Settings, in certain demanding games or flight sims.
> 
> For 1080p gamers, the 3080 is more than adequate.
> 
> System RAM Usage:
> View attachment 2469804
> 
> 
> @BluePaint, because the 3080 in inadequate for my usage, I'm also jumping up as I have the 3090 in route; however, the 3080Ti would also be perfectly fine too, which I wouldn't mind. I'll be putting the 3090 on water for sure as I already have the block...


Memory allocation doesn't necessarily mean the game needs that much memory. Some games allocate memory according to how much VRAM is available to them, even if they're only going to actually use, say, half of it. I don't know how much Flight Sim actually uses, but just be aware that VRAM allocated doesn't necessarily equal VRAM required.


----------



## iunlock

MakubeX said:


> Memory allocation doesn't necessarily mean the game needs that much memory. Some games allocate memory according to how much VRAM there is available to them even if say they're only going to actually use half of it for example. Now, I don't know how much Flight Sim actually uses but just be aware that VRAM allocation doesn't necessarily equal VRAM required.


Correct. Rather than showing both memory-usage readouts (HWiNFO64 and AB) in the OSD, I just prefer to use the AB sensor, to keep the OSD organized.

Therefore...

I've exited the game and restarted so that you can see the transitions, but the *graph max is set to 10,000MB.*









*99.2% Memory Usage*: (This is not allocation. The GPU VRAM is getting taxed to the very edge.)









Here you can clearly see the *GPU DEDICATED VRAM* getting taxed: (Again, this is NOT allocation.)










For *System RAM* data:









The 10GB on the 3080 isn't enough for me. It's a great card, and value-wise I think the 3080 and 3080Ti will be the best two choices, but ultimately the 3080Ti will be the perfect balance: the best of all worlds this gen.

I have a 3090 en route, which wouldn't be my first pick, but the 3080Ti doesn't exist right now, so oh well..


----------



## MakubeX

iunlock said:


> Correct. Instead of using both memory usage data output in the OSD from hwinfo64 and AB, I just prefer to use the AB sensor, which I have displayed on my OSD.
> 
> Therefore...
> 
> I've exited the game and restarted so that you can see the transitions, but the *graph max is set to 10,000MB.*
> View attachment 2469942
> 
> 
> *99.2% Memory Usage*: (This is not allocation. The GPU VRAM is getting taxed to the very edge.)
> View attachment 2469943
> 
> 
> Here you can clearly see the *GPU DEDICATED VRAM* getting taxed: (Again, this is NOT allocation.)
> View attachment 2469944
> 
> 
> 
> For *System RAM* data:
> View attachment 2469945
> 
> 
> The 10GB on the 3080 isn't enough for me. It's a great card and I think value wise the 3080 and 3080Ti will be the best two choices, but I think ultimately the 3080Ti will be the perfect balance of having the best of all Worlds this gen.
> 
> I have a 3090 in route, which wouldn't be my first pick, but the 3080Ti doesn't exist right now so oh well..


Ooph, yeah in that case that's certainly not just allocation. MS Flight Sim is certainly an outlier among games in terms of resource consumption. Then again, there's nothing else like it.

I'm curious to see where it maxes out in VRAM usage on the 3090 when you get it.


----------



## iunlock

MakubeX said:


> Ooph, yeah in that case that's certainly not just allocation. MS Flight Sim is certainly an outlier among games in terms of resource consumption. Then again, there's nothing else like it.
> 
> I'm curious to see where it maxes out in VRAM usage on the 3090 when you get it.


I'll surely keep you guys posted. MSFS is pretty demanding, but DCS has been even more demanding in my testing. It's nuts.


----------



## MakubeX

iunlock said:


> I'll surely keep you guys posted. MSFS is pretty demanding, but DCS has been more demanding from my testing. It's nuts.


I was just fixing my post when you beat me to it, haha. My bad, you did say DCS World. How is it compared to MS Flight Sim?


----------



## iunlock

All good. Funnily enough it was relevant, as I had tested MSFS the other day too...

MSFS at 4K with max settings was pretty much on par with a stock 3090 FPS-wise. I'll have to dig up the data...


----------



## DStealth

To conclude whether 10GB is enough or not (or whether VRAM is just allocated or actually used), you have to use a different approach.
Install GpuRamDrive from here: https://github.com/prsyahmi/GpuRamDrive
Then start incrementing the drive size, say from 256MB, then 512, 768, 1024 and so on, and measure the minimum FPS each time. When it starts dipping below the normal values, your video RAM is no longer enough. Let us know the results please.
IMO the dip will come with around 6 or 7GB left available, as a 3070 with 8GB surely (100%) wouldn't have drops.
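The search loop described above can be sketched in a few lines. This is only an illustration: `measure_min_fps()` is a hypothetical hook standing in for "run the game with that much VRAM reserved by the RAM drive and note the minimum FPS from your overlay".

```python
def find_vram_breakpoint(total_vram_mb, baseline_fps, measure_min_fps,
                         step_mb=256, tolerance=0.9):
    """Grow the GPU RAM drive until min FPS dips below 90% of baseline.

    Returns the VRAM still available to the game when the dip first
    appears, or None if no dip is ever observed.
    """
    reserved = step_mb
    while reserved < total_vram_mb:
        fps = measure_min_fps(reserved)       # run the game, note min FPS
        if fps < baseline_fps * tolerance:    # dipping below normal values
            return total_vram_mb - reserved   # VRAM the game had left
        reserved += step_mb
    return None
```

With a 10GB card, if the dip first appears with ~3.3GB reserved, the game effectively needed just under 7GB, which is the kind of result the post above predicts.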


----------



## iunlock

I have no doubt that the amount of VRAM is inadequate for my needs: stuttering, flickering, etc. It's not a matter of wondering whether the VRAM is the culprit, because it is, without question.

I'll look into trying out the test. Thanks.


----------



## PhoenixMDA

SoldierRBT said:


> That's awesome for TUF owners. What's the TDP in the new BIOS. You can check this on GPU-Z.


Holy crap, a reported 377.1W is, with my 20mOhm mod, actually *471.4W...*
At this resolution that is much more than before.
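The jump from 377.1W reported to ~471.4W real is just the shunt-stack arithmetic: a 20 mOhm resistor in parallel with the stock shunt (assumed here to be 5 mOhm, which reproduces the numbers in the post) lowers the resistance the card senses, so it under-reports power by the resistance ratio. A quick sketch:

```python
def true_power(reported_w, r_stock_mohm, r_added_mohm):
    """Correct a shunt-stacked card's reported power draw.

    The card measures the voltage drop across the parallel pair,
    so reported power is scaled down by R_parallel / R_stock;
    invert that ratio to recover the real draw.
    """
    r_parallel = (r_stock_mohm * r_added_mohm) / (r_stock_mohm + r_added_mohm)
    return reported_w * (r_stock_mohm / r_parallel)

# 20 mOhm stacked on an assumed 5 mOhm stock shunt -> 1.25x correction
print(round(true_power(377.1, 5.0, 20.0), 1))  # 471.4
```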


----------



## elox

I must have missed it somewhere, but is there any way to increase the core clock while lowering voltage (MSI curve editor) AND keep the +9% power target?
The 109% power target works fine when using the Gainward Expert Tool, but as soon as I edit the curve in MSI AB, the power limit is back to 100%.
Also, I heard the ASUS TUF BIOS has 370W and should work on a Gainward Phoenix RTX 3080? Anyone tried it yet?


----------



## Anaximander

Hi everyone, does anyone have a BIOS recommendation for the Palit 3080? I can't manage to get it over the 340W limit.


----------



## man from atlantis

My card has just arrived; looking forward to my day off work so I can get my hands on her.


----------



## iunlock

*Main Gaming Desktop / 9900KS @ 53x / RTX 3080 Stock Blower *

The best thing about doing these fun runs on the gaming desktop is that the card isn't even on my test bench yet. It's pretty fun, because you know it can do more. I'm pretty impressed with this 3080 and how close it is to the 3090 in real-world gaming scenarios; in a lot of instances, with an OC it already matched or beat a stock 3090. The only thing lacking for me is the 10GB of VRAM, which will make the 3080Ti a nice sweet spot.











https://www.3dmark.com/fs/24372637


----------



## elox

Anaximander said:


> Hi everyone, anyone having a BIOS recommendation for Palit 3080? Can't manage to get it over the 340W limit


Palit and Gainward are the same company, and both cards have 2x8-pin power and the reference PCB, so I guess we're looking for the same thing. I heard the ASUS TUF 3080 BIOS should work and give you 370W. I have not tested it yet.


----------



## ZealotKi11er

I'm trying to figure out why people are so obsessed with pushing a 320W card to these crazy levels. The gains seem minimal to me. I know it's OCN, but damn, these cards suck power. Also, for stuff the 3080 can't cope with, 10% more perf won't help it.


----------



## rankftw

So I flashed the Amp Holo BIOS to my Palit Gaming Pro, and although it reports the power limit is now increased to 374W, the card still hits the performance cap at around 340W. I think it may be hardware-locked at that.


----------



## MrBridgeSix

Is the NVFlash 5.670 update just for 3060 Ti support?


----------



## Vapochilled

Can someone post the new TUF OC Bios here ?


----------



## PhoenixMDA



Vapochilled said:


> Can someone post the new TUF OC Bios here ?


TUF-RTX3080-O10G-GAMING ｜ Grafikkarten ｜ Mainboards / Komponenten ｜ ASUS Deutschland


----------



## NavaGabe

f0leyme1ster said:


> Seconded, got the same card and the power limit is getting annoying


I flashed the TUF OC BIOS, and it does allow you to slide the power limit to 110%, if you want to give it a try.


----------



## OnkelB91

I'm just getting "Your graphic card no need to update VBIOS!" when using the ASUS tool on my TUF OC. Is there any dump of the new BIOS for manual flashing available anywhere?


----------



## MrBridgeSix

OnkelB91 said:


> I'm just getting "Your graphic card no need to update VBIOS!" when using the ASUS tool on my TUF OC. Is there any dump of the new BIOS for manual flashing available anywhere?


Extract the .exe with 7-Zip.


----------



## phoenixyz

OnkelB91 said:


> I'm just getting "Your graphic card no need to update VBIOS!" when using the ASUS tool on my TUF OC. Is there any dump of the new BIOS for manual flashing available anywhere?


Create a folder and put nvflash in it. Go to TechPowerUp and download the BIOS you want that matches your card, and put the ROM in the same folder as nvflash. Then open cmd in admin mode, cd to the directory where the ROM is, and disable the flash protection. Back up your original BIOS before flashing the new one in. After that, re-enable the BIOS protection.
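Those steps map onto a short command sequence; a sketch assuming the 64-bit nvflash build, with placeholder filenames:

```shell
:: Run from an elevated command prompt in the folder containing
:: nvflash64 and the downloaded ROM (filenames here are placeholders).
nvflash64 --protectoff
:: Back up the original BIOS before anything else.
nvflash64 --save backup.rom
:: Flash the new BIOS; -6 acknowledges the PCI subsystem ID mismatch.
nvflash64 -6 new_bios.rom
:: Re-enable the EEPROM write protection when done.
nvflash64 --protecton
```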


----------



## man from atlantis

I just flashed my Palit Gamerock with OC bios, power limit increased from 400 to 440W.

Stock Gamerock bios at (1905-1950MHz)/9500MHz, 340W, Port Royal result is 11421
Gamerock OC bios at +100MHz/10875MHz, 440W, Port Royal score is 12431

I also have an undervolt profile, [email protected]/10875MHz, ~260W, Port Royal score ~11441,
and another profile for casual use, [email protected]/10875MHz, 390W, Port Royal score ~12200.

VRAM scaling is pretty good (probably thanks to the VRAM heatsink). It doesn't hit error correction until 10875MHz; 11000MHz scores similar or sometimes worse, but 10750MHz always scores worse than 10875MHz, so I settled at 10875MHz.

9500MHz, Port Royal score, 11421
9750MHz, Port Royal score, 11485
10000MHz, Port Royal score, 11545
10250MHz, Port Royal score, 11582
10500MHz, Port Royal score, 11622
10750MHz, Port Royal score, 11656
10875MHz, Port Royal score, 11667
11000MHz, Port Royal score, 11684

This is all still fresh; I'll probably need to fine-tune it little by little over time.
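The diminishing returns in the scaling table above are easier to see as marginal gains per step; a quick sketch using the scores from the post:

```python
# Port Royal scores vs. memory clock, taken from the table above.
scores = {
    9500: 11421, 9750: 11485, 10000: 11545, 10250: 11582,
    10500: 11622, 10750: 11656, 10875: 11667, 11000: 11684,
}

# Marginal gain per step: the increments shrink overall as the clock
# rises, which is why 10875MHz ends up as the practical sweet spot.
clocks = sorted(scores)
for lo, hi in zip(clocks, clocks[1:]):
    print(f"{lo} -> {hi} MHz: +{scores[hi] - scores[lo]} points")
```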


----------



## VPII

rankftw said:


> So I flashed the Amp Holo BIOS to my Palit Gaming Pro and although it said that the power limit was now increased to 374w it still hit the performance cap at around 340w. Think the card may be hardware locked at that.


Yup, from what I saw with the Palit I had before, it was hardware-locked, I think via load balancing.


----------



## tsamo

Hello everyone. I have a Gigabyte Vision 3080 with a power limit of 350W; which BIOS exactly have people used to get access to those last watts?
From what people have said, the TUF BIOS is the one, but which of them? I assumed the latest one, but when unzipped it contains 11 BIOS files. Am I missing something?
Thanks in advance!


----------



## Klogarg

tsamo said:


> Hello to everyone. I have a Gigabyte Vision 3080 with a power limit of 350W, which bios exactly has people used to get access to those last watts?
> From what people said the TUF bios is the one, but which one of them? I thought the last one but when unzipped it has 11 bios files. Am I missing something?
> Thanks in advance!


I also have a Gigabyte Vision with the newest BIOS, which according to GPU-Z is supposed to have a 370W limit, but even so it caps at 350W and throttles every time it goes beyond that point. You can probably update it with the Aorus Engine, but I'm not sure whether it's still limited in some way by the BIOS.

Any recommendations for a compatible BIOS for a 2-connector PCB that I could try?


----------



## outofmyheadyo

What is the best Bios for 3080 TUF OC these days?


----------



## NRockwell

Hi, I have a Gigabyte RTX 3080 Eagle OC with a 340W power limit and no option to change it.
My bios: Gigabyte RTX 3080 VBIOS

Is there any way to swap in a BIOS from another model?


----------



## nam3less

Got my Strix installed, upgrading from the Ventus. Max voltage is 1081mV, and it doesn't even hit the power limiter at that voltage, so it seems I got an "ok" to above-average chip. At stock on the OC setting via GPU Tweak it's odd: it doesn't even boost past 1980 even though the rated clock is 1935. It's like whatever ASUS did disabled full boosting. I was able to get the core to 2145 but it doesn't hold; in PR it immediately throttles down to 2100 and even 2085.

Anyway, memory doesn't go above +700 without crashing, which is disappointing. Scaling also stopped at +700; going to +725, the PR score dropped by 100. Max score was 12362 and I can't be bothered to go higher. I know I could probably squeeze this for all it's worth, up to 12,4xx or even 12,500, but I don't want to spend the time.

Final stable daily-driver gaming settings are 2100 core and +500 mem. This is stable in everything I've tested, including Warzone with RTX. I have the fans on a custom silent curve, so it hovers at 2070-2100 because I don't blast the fans. 2100 on air is great, certainly better than my Ventus, and not worrying about the power limit is GREAT. Is that worth $100 more? Ehhhhhhhh.


----------



## obscurehifi

Hi all, I'm new here and this is my first post, so go easy on me ;-)

I have had the Gigabyte AORUS 3080 Waterforce for a few days now. Big jump in performance from my 1080 OC!

Anyways, I've been doing some testing, and it seems I'm hitting the power limit quite early. The stock BIOS is rated 370W, but it seems to hit the power limit closer to 352W. This is with the power limit set to 100% (everything stock). Every now and then there's a spike up into the 360s.

What is nice about this one is I'm definitely not hitting thermal limits being water cooled. I'm maxing out under load around 59 to 60C and it's super quiet with the fans only running at 35%/1173rpm under full load. It's also keeping my cpu cooler since it's not dumping hot air inside the case.

My first question to the community is about the BIOS. The AORUS cards have 3 DisplayPort and 3 HDMI connections. Will they all still work with a BIOS from a different manufacturer that only has a total of 4 or 5 connections?


----------



## ducegt

ZealotKi11er said:


> I am trying to figure out why people are so upsest with pushing 320W card to these crazy level. The gains seem minimal to me. I know its OCN but dam these cards suck power. Also stuff that 3080 cant cope with, 10% more perf will not help it.


Haha! Wait until you get your hands on the card and I'm sure you'll toss that sound reasoning out the window!


----------



## obscurehifi

Here are some results from my adventures in overclocking and undervolting. (I'm boosting clocks while limiting voltage, so hopefully that still counts as undervolting.)

Card: Gigabyte AORUS 3080 Waterforce stock 370W bios 94.02.26.48.AB but power limits between 345 and 350W (Why only 350?!). One of the reasons I opted for this card are the 6 display connections and water cooler. It's so quiet!
CPU: 3800x PBO Level 1
Display: 2560x1440p G7, G-sync disabled

*Stock Settings (power limited at 350W)*
Time Spy --->16089 with 17978 graphics score https://www.3dmark.com/3dm/55092052
Port Royal --->11655 https://www.3dmark.com/3dm/55046095
Fire Strike Extreme --->19816 https://www.3dmark.com/3dm/55047226
Shadow of Tomb Raider Highest Setting RT Shadows off --->21152 Ave FPS 134
Superposition 4k Optimized, Textures High, DOF enabled, Motion Blur enabled ---> 25553 max 57degC

*Undervolted and OC'd at 993mV / 2070Mhz gpu, +0 Mem (still power limited at 350W)*
Time Spy --->16326 with 18320 graphics score https://www.3dmark.com/3dm/55159217
Port Royal --->11839 https://www.3dmark.com/3dm/55153802
Fire Strike Extreme --->20228 https://www.3dmark.com/3dm/55154093
Shadow of Tomb Raider Highest Setting RT Shadows off --->21324 Ave FPS 136
Superposition 4k Optimized, Textures High, DOF enabled, Motion Blur enabled ---> 25960 max 57degC

Here's the curve I used:








I tried some lower voltages and lower frequencies, but things started crashing. It seems stable so far at 993mV / 2070MHz.

How do my numbers look? Any suggestions?


----------



## ssgwright

OK, I just tried soldering my shunts and yeah... I can't solder; I gave up. The next best thing I can think of is the silver paint. Problem is, I'm in Hawaii; I can't find it locally, and no one is allowed to ship it to me. HELP


----------



## Falkentyne

ssgwright said:


> ok I just tried soldering my shunts and yeah... I can't solder, I gave up. The only next best thing I can think of to do is the silver paint, problem is I'm in Hawaii and I can't find it locally and no one is allowed to ship it to me. HELP


Soldering is quite easy once you've done it a few times. Did you use flux?
Follow this video and you'll be able to solder. The SECRET is to 1) flux, 2) drop solder on top of the flux, 3) FLUX AGAIN ON TOP OF THE SOLDER!, 4) place the shunt on top of the solder/flux, 5) melt (applying light downward pressure with engineer's tweezers or something small and solid). You'll notice it's the flux that directs the solder.






You could try ordering this stuff? Might be easier to get than the MG Chemicals stuff.
(Don't use the circuitwriter pen--it doesn't work on Ampere very well).



https://www.amazon.com/gp/product/B00KBXT6JW/


----------



## ssgwright

ya I did use flux... I think the resistor metal oxidized too fast so the solder wouldn't stick... I dunno... ok I just ordered what you recommended, hopefully that works.

thanks for the help!


----------



## Falkentyne

ssgwright said:


> ya I did use flux... I think the resistor metal oxidized too fast so the solder wouldn't stick... I dunno... ok I just ordered what you recommended, hopefully that works.
> 
> thanks for the help!


Could it possibly have been the wrong flux? The solder shouldn't just fall off unless the flux wasn't right.


----------



## ssgwright

this is what i used:


----------



## Falkentyne

ssgwright said:


> this is what i used:
> View attachment 2470235


I don't know much about soldering but aren't you supposed to use something like this?



 https://www.amazon.com/gp/product/B008ZIV85A/



I know there's two kinds of flux, there's "no clean flux" and "Rosin flux paste" but I don't know anything about this stuff.


----------



## acoustic

EVGA Hybrid cooler for my 240mm will be here in time for XMas. Looking forward to it


----------



## ssgwright

Falkentyne said:


> I don't know much about soldering but aren't you supposed to use something like this?
> 
> 
> 
> https://www.amazon.com/gp/product/B008ZIV85A/
> 
> 
> 
> I know there's two kinds of flux, there's "no clean flux" and "Rosin flux paste" but I don't know anything about this stuff.


not sure I'm new to soldering myself


----------



## Falkentyne

ssgwright said:


> not sure I'm new to soldering myself


Try the rosin paste then, because the solder should have attached to the flux and then hardened without dropping off the shunt.
I'm not sure if there's any conformal coating on top of the silver edges of the original shunts, but sometimes it helps to scrape that off too with a small flathead.


----------



## akkuman

obscurehifi said:


> I have had the Gigabyte AORUS 3080 Waterforce for a few days now. Big jump in performance from my 1080 OC!


Same here. I flashed the TUF Bios but get lower Clocks. Went back to the O-Bios.


----------



## obscurehifi

akkuman said:


> Same here. I flashed the TUF Bios but get lower Clocks. Went back to the O-Bios.


Was it the TUF or TUF OC bios? What power limit are you getting?


----------



## obscurehifi

I'm realizing my motherboard might be limiting the PCIe slot power. It should supply up to 75W, but I'm only getting 60W. I'm getting up to 306W total through the 8-pins from my EVGA 850W G2. In total I'm getting just under 360W, as the three power rails fluctuate a bit. Even then, it seems like the card shouldn't need to power-limit, unless it's sensing it's against the limit of what the mobo can supply. I have the ASUS TUF X570-Plus WiFi.

I'm having really good results with the voltage/clock curve I posted above, with 99.8% frame rate stability. I just finished the Port Royal stress test. The GPU temp stabilized at 57 deg C with the fans ramping up less than 1600rpm.
https://www.3dmark.com/3dm/55176026

The lower load line is my CPU.


----------



## manjooie

Hey, just wondering if anyone has flashed a BIOS to the EVGA 3080 XC3 with any luck of higher power limit. I can't get mine to pull above 330w


----------



## iunlock

*Cyberpunk 2077: 9900KS | RTX 3080 | 21:9 - 4K (3840x1600)*

Here are the quick-preset ULTRA settings that give me 60 FPS in the game, another screenshot showing 35 FPS with DLSS OFF, and another showing 56 FPS with DLSS Quality.

I picked this specific location on purpose as it is pretty consistent overall with a lot of details on the screen.

CPU: 5.3GHz
System RAM: 11,000MB+
GPU VRAM: ~9500MB

Note the resolution that I'm gaming at. This is 21:9 - 4K.

*Ray Tracing Ultra: Preset*

Only changes:
Film Grain: *OFF*
RT: Reflections *ONLY*
RT Lighting: *ULTRA*
DLSS: *OFF*, *AUTO*, and *QUALITY*

60 FPS w/ DLSS: AUTO









56 FPS w/ DLSS: QUALITY









35 FPS w/ DLSS: *OFF*









----------



## c0nsistent

Well... I tried the shunt mod on my XC3 3080, but I used the hot-glue method that I saw on YT and yeah... no. The connection must not be strong enough without solder, because I'm not seeing any changes to my TDP.

I'm not sure if I want to solder yet, as that is a big and risky step to take.

Isn't there some sort of conductive paint that can be used to attach the resistors for stacking?


----------



## akkuman

obscurehifi said:


> Was it the TUF or TUF OC bios? What power limit are you getting?


Around 340W, with +0 on the core and the power limit slider at +15%.


----------



## Falkentyne

c0nsistent said:


> Well... I tried the shunt mod on my XC3 3080 but I tried the hot glue method that I saw on YT and yeah... no. The connection must not be strong enough without solder because I'm not seeing any changes to my TDP.
> 
> I'm not sure if I want to solder yet, as that is a big step to take and a risky one.
> 
> Isn't there some sort of conductive paint that can be used to attach the resistors for stacking?


Yes, MG 842AR silver paint. But please check the shunts to see if the silver edges are lower than the black middle housing or level with it (flush). Flush shunts are MUCH EASIER to use the silver paint on than shunts with depressed edges (the Founders Edition shunts have depressed edges).

Make sure you carefully scrape the conformal coating off the silver edges with a small flat-blade screwdriver before applying paint to them. This is important. Take your time and go very slowly so you don't scrape the PCB. You can tape around the shunt with Super 33+ tape if that helps (this is also useful when applying the paint!!!). Note: for soldering, you should use polyimide Kapton high-temp tape around the shunts, not Super 33+ tape.


----------



## man from atlantis

Palit GameRock Vanilla at GameRock OC BIOS,
Horizon Zero Dawn:

[email protected] GPU, 2719MHz MCLK, ~245W, 77FPS 3840*2160, 43C,
+0 (1995-2025MHz) GPU, 2375MHz MCLK, ~360W, 78FPS 3840*2160, 51C,
[email protected] GPU, 2719MHz MCLK, ~390W, 82FPS 3840*2160, 52C,
+75 (2100-2130MHz) GPU, 2719MHz MCLK, ~410W, 83FPS 3840*2160, 53C,
[email protected] GPU, 2719MHz MCLK, ~410W, 83FPS 3840*2160, 53C.


----------



## StreaMRoLLeR

BluePaint said:


> Cooling the caps on the back of the card with a high-RPM fan really helps me keep +2150 MHz in Cyberpunk. With less fan cooling, like I had before, it would crash occasionally even below 2100.
> 
> SoldierRBT mentioned before that cap cooling can make a difference of 15-30 Mhz for benchmarks. In my case for gaming it seems to make a difference of almost 100Mhz.
> 
> That makes me want to cap-mod my MSI Gaming, because it only has a single MLCC cap. The higher-end MSI Suprim X, which uses exactly the same PCB, uses 4 MLCCs.
> 
> Does anyone have the exact specs (so that I can find it in an online shop) for MLCCs which can be used for that? I am not really familiar with those things and I had trouble identifying the exact type.
> 
> View attachment 2469696



Yes, we found out together with SoldierRBT that reducing VRM and MLCC temps greatly stabilizes the overclock. Also apply thermal pads to the back side of the PCB to increase efficiency. I would recommend settling for 2130 and a less aggressive voltage; it doesn't make any FPS difference. I even tried 2235 MHz.


----------



## omarrana

obscurehifi said:


> Here are some results in my adventures of overclocking and undervolting. Although I'm boosting and limiting voltage, so hopefully that's still referred to as undervolting.
> 
> Card: Gigabyte AORUS 3080 Waterforce stock 370W bios 94.02.26.48.AB but power limits between 345 and 350W (Why only 350?!). One of the reasons I opted for this card are the 6 display connections and water cooler. It's so quiet!
> CPU: 3800x PBO Level 1
> Display: 2560x1440p G7, G-sync disabled
> 
> *Stock Settings (power limited at 350W)*
> Time Spy --->16089 with 17978 graphics score https://www.3dmark.com/3dm/55092052
> Port Royal --->11655 https://www.3dmark.com/3dm/55046095
> Fire Strike Extreme --->19816 https://www.3dmark.com/3dm/55047226
> Shadow of Tomb Raider Highest Setting RT Shadows off --->21152 Ave FPS 134
> Superposition 4k Optimized, Textures High, DOF enabled, Motion Blur enabled ---> 25553 max 57degC
> 
> *Undervolted and OC'd at 993mV / 2070Mhz gpu, +0 Mem (still power limited at 350W)*
> Time Spy --->16326 with 18320 graphics score https://www.3dmark.com/3dm/55159217
> Port Royal --->11839 https://www.3dmark.com/3dm/55153802
> Fire Strike Extreme --->20228 https://www.3dmark.com/3dm/55154093
> Shadow of Tomb Raider Highest Setting RT Shadows off --->21324 Ave FPS 136
> Superposition 4k Optimized, Textures High, DOF enabled, Motion Blur enabled ---> 25960 max 57degC
> 
> Here's the curve I used:
> View attachment 2470222
> 
> I tried some lower voltages and lower frequencies but things started crashing. But it seems stable so far at the 993mv / 2070Mhz.
> 
> How do my numbers look? Any suggestions?


I don't understand: on water cooling, why do you need to undervolt?


----------



## Muqeshem

Interesting topics, guys.
I bought the EVGA FTW3 Ultra and am looking forward to playing with it.
What is the best RTX 3080?
And how good is the EVGA FTW3 Ultra?
How can I improve its performance? Any tips and tricks?


----------



## rioja

Muqeshem said:


> What is the best rtx 3080?


It must be the Strix 3080, then the FTW3, along with the Aorus Extreme.


----------



## StreaMRoLLeR

Muqeshem said:


> interesting topics guys.
> I bought the evga ftw ultra and looking forward to play with it.
> What is the best rtx 3080?
> and how good is the evga ftw ultra ?
> How can I improve its performance? Any tips and tricks?


Strix, FTW3 and Suprim X.

The Strix has coil whine (CW) and fan bearing problems.

The Aorus Extreme is 1 MLCC + 5 POSCAP.


----------



## delreylover

rankftw said:


> So I flashed the Amp Holo BIOS to my Palit Gaming Pro and although it said that the power limit was now increased to 374w it still hit the performance cap at around 340w. Think the card may be hardware locked at that.


Hello, have you noticed any significant performance gains (in benchmarks or games)?
Thank you


----------



## rioja

Streamroller said:


> strix ftw3 and suprim-x
> Strix have CW and fan bearing problem
> Aorus extreme is 1 mlcc 5 poscap.


The Suprim has 16 power + 4 memory stages, not the best combination, and there must be a reason they rate it at only 370W.
I also have a feeling the Strix coil whine isn't so common anymore; it was probably a first-batch issue (I hope).


----------



## iunlock

Streamroller said:


> Yes, SoldierRBT and I found that reducing VRM and MLCC temps greatly stabilizes the overclock. Also, applying thermal pads to the back side of the PCB improves efficiency. I would recommend settling for 2130 MHz and a less aggressive voltage; it doesn't make any FPS difference. I even tried 2235 MHz.


That is true. A higher overclock on the core doesn't necessarily translate to higher FPS. For stock blowers there is a sweet spot (limit) where it'll settle, and no matter how much you OC past it, it won't make a difference... diminishing returns, actually, due to the thermals.

Even with a water block the same principle applies: it all comes down to thermals, which dictate how the core scales. I've peaked at 2220MHz on the stock 3080, but that was just for fun. On the stock blower the ideal OC sits around 2160MHz for me, which is the limit on air.

I have a water block for the 3080/3090, but I haven't installed it on the 3080 since I'm reserving it for the 3090 that arrives tomorrow.



Muqeshem said:


> interesting topics guys.
> I bought the evga ftw ultra and looking forward to play with it.
> What is the best rtx 3080?
> and how good is the evga ftw ultra ?
> How can I improve its performance? Any tips and tricks?



I always get EVGA every gen; however, I also bought MSi and Asus cards, which have been just as good of a bin as EVGA. In fact, a lot of my records are with MSi cards on both air and water. This gen with the 30 Series, you really can't go wrong with any of the three... at this point it's really about whatever you can get your hands on.


----------



## VPII

rioja said:


> Suprim 16 power + 4 mem stages, not the best combination and there must be a reason they rate it at only 370w
> I also have a feeling that Strix coil whining not so common anymore probably first batches issue (I hope)


The Suprim X is 370 watts stock, up to 430 watts. Trust me, I know, as I used its BIOS on my Gaming X Trio.


----------



## phoenixyz

VPII said:


> The Suprim X is 370 watts stock, up to 430 watts. Trust me, I know, as I used its BIOS on my Gaming X Trio.


The best BIOS for the Gaming X Trio is the Strix BIOS. It's 450 watts. I got a 15% bump in performance, OC to OC, just moving from the stock BIOS to the Strix BIOS.


----------



## rankftw

delreylover said:


> Hello, have you noticed any significant performance gains (in benchmarks or games)?
> Thank you


I didn't notice any difference.


----------



## obscurehifi

omarrana said:


> I dont understand on watercooling why do you need to undervolt?


Watercooling helps the thermals for sure while keeping things really quiet, and it removes the possibility of thermal throttling. Unfortunately, power is power, and you can still hit the power limit even if the card runs cooler. Lower temps still help, probably by making the card a little more efficient: pulling heat away better should allow more of the things that generate the heat. This card is rated for 370W, but it throttles closer to 350-360W, so I'm trying to make it throttle differently, or less hard off the limit. On a side note, I'm not sure if it's the card that's limiting the power or my PCIe slot supply, since I'm getting the full rated 300W from my main power supply at the 8-pin connectors.

The variables that create heat are voltage, current, and frequency (and load factors in too). From what I've read, dynamic power goes up linearly with frequency and roughly with the square of voltage. So by keeping power constant and reducing voltage, you should be able to add more frequency. More frequency means more performance, but it has to be balanced against voltage. In tests today I found that voltages closer to 0.850V can't take as much frequency before crashing in a Fire Strike Ultra loop. The opposite is also true: voltage closer to the ~1.08V limit generates too much heat (power), so the card can't hold high frequencies either; it just throttles and falls down the frequency/voltage curve. Depending on the curve, it then ends up at a lower frequency than an undervolted/boosted curve would give. So I find there's a sweet spot between limiting voltage, heat generated, stability, and speed.
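That constant-power trade-off can be put into a few lines. This is only a rough sketch: it assumes dynamic power scales as V²·f, ignores static leakage, and the numbers are illustrative, not measured on any card.

```python
# Rough model of the constant-power trade-off: dynamic power ~ k * V^2 * f.
# Real GPUs also have static leakage, so treat results as illustrative only.

def relative_power(voltage_v, freq_mhz):
    """Power relative to some baseline, up to a constant factor k."""
    return voltage_v ** 2 * freq_mhz

def freq_at_same_power(freq_mhz, old_v, new_v):
    """Frequency headroom gained by dropping voltage at constant power."""
    return freq_mhz * (old_v / new_v) ** 2

# Dropping from the ~1.08 V limit to a 0.993 V undervolt at the same
# power budget buys roughly 18% more frequency in this simple model:
print(freq_at_same_power(2000, 1.08, 0.993))  # ~2366 MHz
```

In practice the headroom is smaller than the model suggests, since stability (not just power) caps the frequency at low voltage, which matches the crashes described above near 0.850V.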

I also find that my card is more stable, with less stutter, between 320W and the 350/360W limit it has. Every way I got it under 300W, performance really seemed to tank, at least visually in 4K benchmarks; 270W and 280W looked less than optimal. It crashes more easily at lower voltages, and at higher voltages it generates more power and heat (but stays stable, because the card just falls down the curve, dropping voltage and frequency).

I tried a different voltage/frequency curve today, increasing all the way to the voltage limit; I think it was set to around 2200MHz at 1.08V. It ran pretty well, but the power limit throttled the card back. The 2070MHz at 0.993V undervolt curve I posted earlier actually gave higher scores. I'm starting to wonder what this does in less-than-full-load situations.

Anyways, those are my theories and conclusions for the weekend! I'm sure I'll learn some more as I play more.

Cheers


----------



## Flisker_new

Hello,

It's been a long time since I upgraded my GPU (1080 Ti). I would just like to ask: is there any way to mod the BIOS, as used to be possible? Like disabling the power limit completely?

I have the EVGA FTW3 Ultra, found the XOC bios which is great, but was curious if that's the best one can do these days.

Thanks o/


----------



## ZealotKi11er

Hi guys

I am trying to OC ASUS TUF 3080 non-OC model.
MSI AB has the power slider at 117%. The issue is that in-game it does not go to 117%; it hovers at 105-112%, but the limiting factor is always PWR. Is this normal?


----------



## josephimports

ZealotKi11er said:


> Hi guys
> 
> I am trying to OC ASUS TUF 3080 non-OC model.
> MSI AB has power slider to 117%. The issue is that in game it does not go to 117. It hovers 105-112 but the limiting factor is always PWR. Is this normal?


Yes, I've noticed the same behavior on the OC version of the TUF. Mine runs best at 2010MHz / 0.925V, which typically drops to 1995-1980MHz under load. PL maxed at 110%. Barrow block. The silicon isn't the best.


----------



## VPII

phoenixyz said:


> The best BIOS for the Gaming X Trio is the Strix BIOS. It's 450 watts. I got a 15% bump in performance, OC to OC, just moving from the stock BIOS to the Strix BIOS.


Due to the fan speed difference, the Strix BIOS does not work well with my Gaming X Trio; the Suprim X BIOS is much, much better, as its fan speeds are similar to the Gaming X Trio's.


----------



## zlatanselvic

Best bios for ventus 3080 anyone?


----------



## Miro75

zlatanselvic said:


> Best bios for ventus 3080 anyone?


Unfortunately, the stock one plus a shunt mod. It gives around 410W TDP. Without the mod I've tried all the BIOSes available and always hit the 320W power limit. Correct me if I'm wrong.


----------



## VPII

Miro75 said:


> Unfortunately, the stock one plus a shunt mod. It gives around 410W TDP. Without the mod I've tried all the BIOSes available and always hit the 320W power limit. Correct me if I'm wrong.


You are right; there are a couple of 3080s that seem to have load balancing that limits power to the stock BIOS level even after a BIOS update.


----------



## man from atlantis

Palit GameRock OC, fed by a Seasonic SS-1000XP (c. 2012).

Line voltage drop over the 16AWG 8-pin connectors and the motherboard, idle to 440W load. No daisy-chain cabling or extension cords.

PCIe -1.49%
8-pin #1 -0.73%
8-pin #2 -1.06%
8-pin #3 -0.73%


----------



## Miro75

VPII said:


> You are right; there are a couple of 3080s that seem to have load balancing that limits power to the stock BIOS level even after a BIOS update.


Anyway... I tried shunt modding with 5mOhm, 8 and 10. Forget about 5mOhm; 8 or 10 is the best option. It gives you another 90-100W, so finally you'll be able to reach 410-420W. Good enough; much better than the rubbish 320W.


----------



## man from atlantis

Some FLIR shots of the backs of various cards, from the Korean site 퀘이사존:

ASUS TUF Gaming RTX 3080 O10G OC 10GB (340/375W)
GIGABYTE RTX 3080 Gaming OC 10GB (370/370W)
GIGABYTE AORUS Xtreme RTX 3080 10GB (350/370/370W)
PALIT RTX 3080 GAMEROCK OC 10GB (340/370/440W)
GAINWARD RTX 3080 GS OC 10G (320/350W)
INNO3D iChiLL RTX 3080 10GB X3 (340/340W)

[FLIR images not preserved]


----------



## zlatanselvic

Miro75 said:


> Anyway... I tried shunt modding with 5mOhm, 8 and 10. Forget about 5mOhm; 8 or 10 is the best option. It gives you another 90-100W, so finally you'll be able to reach 410-420W. Good enough; much better than the rubbish 320W.



I saw better performance and boost behavior with the TUF OC bios.


----------



## Tergon123

Talon2016 said:


> https://www.3dmark.com/3dm/51349539
> 
> 
> -- 19,680 GPU in Time Spy.
> 
> Asus Strix OC vBIOS just made the FTW3 3080 the card to get IMO. With the Asus vBIOS it is basically shunt modding itself and hugely under reporting power draw and the card holds crazy high boost now. I just managed to score the #1 spot in the US with this vBIOS on my FTW3 Ultra. Reported power draw under max overclock and max fans was around 330w. It's under reporting but the card performance is still scaling and the clocks are boosting to over 2100Mhz and holding near 2100Mhz the entire TimeSpy run. Not quite sure of how or why it's doing this yet, but its working and working well .
> 
> Fans run at same 3000rpm max so no issues there. One DP was deactivated, but haven't tried HDMI. Works for me as I use 2 DP ports for my monitors and they work.
> 
> Asus Strix OC 3080 vBIOS shared by a nice new owner over at reddit. They did us a solid!
> 
> *Asus Strix OC 3080 vBIOS*
> 
> File on MEGA (mega.nz)


Will this work on MSI SuprimX as well?


----------



## Muqeshem

Yeah, so I was doing benchmarks with my EVGA FTW3 Ultra.

I scored 18,602 in Time Spy (Intel Core i9-9900K, RTX 3080, 16GB RAM, 64-bit Windows 10, via www.3dmark.com).

This was achieved with +135MHz on the core and +950MHz on the memory, using the 450W BIOS.


----------



## StreaMRoLLeR

No need to flash the Strix BIOS to the FTW3. LMAO, the FTW3 already has a proper 450W XOC BIOS.


----------



## man from atlantis

Is there any way to achieve zero-fan mode with a custom fan curve in Afterburner? It doesn't go below 30%.


----------



## Tergon123

How do you flash with mismatched vendor on 30 series card?


----------



## ssgwright

--protectoff


----------



## Tergon123

ssgwright said:


> --protectoff


Will I be able to flash back to the original file? I'm guessing I'd have to use --protectoff again as well?


----------



## ScorpMCP

Is it worth getting a waterblock for an Asus TUF OC, or should I hold out for a 3x8-pin card?


----------



## Tergon123

ssgwright said:


> --protectoff


I thought the command-line protect-off wasn't working anymore. What exact version of nvflash is everyone using for 30 series cards when going cross-vendor?


----------



## phoenixyz

ScorpMCP said:


> Is it worth getting a waterblock for a Asus TUF OC, or should I hold out for a 3x8 pin card?


No need. Hold out for an ftw 3 ultra or suprim


----------



## joyzao

Hi Guys

I have an RTX 3080 TUF OC; which BIOS could I flash? What is the safest besides the original BIOS?


----------



## ssgwright

Tergon123 said:


> Thought command line protect off wasn't working anymore, what is the exact version of nvflash everyone is using for the 30 series cards going mismatch.


Yeah, --protectoff works, and yes, you have to do it again to flash back.


----------



## ssgwright

NVFLASH: NVIDIA NVFlash (5.670.0) Download
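For anyone following along, the cross-vendor flashing sequence being discussed looks roughly like this. Treat it as a sketch only: the flags shown (--protectoff, --save, and -6 to override a subsystem-ID mismatch) reflect community usage of the patched 5.670.0 build linked above, the filenames are placeholders, and a DRY_RUN guard is included here so nothing actually flashes. Flashing the wrong ROM can brick a card.

```shell
# Hypothetical cross-vendor flash sequence with a patched nvflash build
# (e.g. 5.670.0). DRY_RUN=echo just prints the commands; drop it to flash.
DRY_RUN=echo

$DRY_RUN nvflash64 --protectoff        # disable BIOS write protection first
$DRY_RUN nvflash64 --save backup.rom   # always back up the stock BIOS
$DRY_RUN nvflash64 -6 strix.rom        # -6 overrides the subsystem-ID mismatch check
# To revert: run --protectoff again, then flash backup.rom with -6.
```

As noted above, you have to run --protectoff again before flashing the backup back.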


----------



## Pedros

Ok, new entry here 

MSI 3080 Suprim X. It was more an occasion buy because stocks were so bad that this was the first my "go-to store" presented me with and what a big "mama" she is ... the tubings of my AIO are a tight fit  ( yeah, don't really focus on the RGB it's all off now and testing that mem cooler to have a little more airflow on the 4 sticks ).

Some results from with a custom curve and +632 on mems:
Firestrike (not really happy with these results since I see many other results on the 40k) :
NVIDIA GeForce RTX 3080 video card benchmark result - AMD Ryzen 9 5950X,Micro-Star International Co., Ltd. MEG X570 UNIFY (MS-7C35) (3dmark.com)

Firestrike Ultra:
NVIDIA GeForce RTX 3080 video card benchmark result - AMD Ryzen 9 5950X,Micro-Star International Co., Ltd. MEG X570 UNIFY (MS-7C35) (3dmark.com)

Port Royal:
NVIDIA GeForce RTX 3080 video card benchmark result - AMD Ryzen 9 5950X,Micro-Star International Co., Ltd. MEG X570 UNIFY (MS-7C35) (3dmark.com)



So, for this specific card, has anyone tried any other BIOS with success? I know this one already allows 430W (+116%) on the power limit... but you know, never stop tinkering...

Another question: has anyone tried whether the NZXT G12 bracket can be mounted on the 3080s? If so, was it worth the hassle?


----------



## VPII

Pedros said:


> Ok, new entry here
> 
> MSI 3080 Suprim X. It was more an occasion buy because stocks were so bad that this was the first my "go-to store" presented me with and what a big "mama" she is ... the tubings of my AIO are a tight fit  ( yeah, don't really focus on the RGB it's all off now and testing that mem cooler to have a little more airflow on the 4 sticks ).
> 
> Some results from with a custom curve and +632 on mems:
> Firestrike (not really happy with these results since I see many other results on the 40k) :
> NVIDIA GeForce RTX 3080 video card benchmark result - AMD Ryzen 9 5950X,Micro-Star International Co., Ltd. MEG X570 UNIFY (MS-7C35) (3dmark.com)
> 
> Firestrike Ultra:
> NVIDIA GeForce RTX 3080 video card benchmark result - AMD Ryzen 9 5950X,Micro-Star International Co., Ltd. MEG X570 UNIFY (MS-7C35) (3dmark.com)
> 
> Port Royal:
> NVIDIA GeForce RTX 3080 video card benchmark result - AMD Ryzen 9 5950X,Micro-Star International Co., Ltd. MEG X570 UNIFY (MS-7C35) (3dmark.com)
> 
> 
> 
> 
> 
> So, for this specific card, anyone tried any other bios with success? I know this already allows 430W (+116%) on the power limit... but you know, never stop tinkering...
> 
> Another question, did anyone tried if we can mount the G12 bracket from NZXT with the 3080s? If so, it was worth the hassle?


Yes you can. I did, however, need to modify the G12 bracket at the bottom when fitting it to my Palit GamingPro OC, though it might not be the same for the MSI Gaming X Trio I have now. Whether it is worth it is difficult to say. I found that I got a slight decrease in power, but the VRAM temps forced me to reduce the memory overclock to almost nothing. If you have small heatsinks to mount on the VRM and VRAM with an active fan blowing over them, it may work, but in all honesty it is not really worth it.

The clock speed drop due to temps on the 3080 is a lot less aggressive compared to the 2080 Ti I had before.


----------



## Muqeshem

Tergon123 said:


> Will this work on MSI SuprimX as well?


Yeah, no need. Just use the EVGA BIOS; I got 19,896 in Time Spy on my EVGA FTW3 Ultra with the EVGA BIOS, not the Strix.


----------



## Tergon123

ssgwright said:


> ya --protectoff works, and yes you have to do it again to flash back


Thanks, and thanks again for the download link as well. Which is better, the FTW3 450W or the Strix OC BIOS? Anyone have any thoughts?


----------



## man from atlantis

Since I'm done with benchmark OC settings at 100% fan speed, I'm now looking for stable gaming clocks using a realistic fan curve and noise levels. Playing Wolcen at 5120x2160 (4K ultrawide) is quite heavy on the GPU, more so it seems than CP2077 (RT Psycho, ~2070MHz) and HZD (4K, over 2130MHz). I'm at a constant 420-435W GPU power. The highest temperature was 67C with a custom fan curve after a 2hr game session, and the settled clock speed is 2025MHz. Underclock values are:
[email protected], ~280W, 60C default fan curve, 1hr session.
[email protected], ~340W, 65C default fan curve, 1hr session.

Next I'm gonna try Q2RTX; I've heard it's heavier on the GPU than other games as well.


----------



## BluePaint

Pedros said:


> Firestrike (not really happy with these results since I see many other results on the 40k) :
> NVIDIA GeForce RTX 3080 video card benchmark result - AMD Ryzen 9 5950X,Micro-Star International Co., Ltd. MEG X570 UNIFY (MS-7C35) (3dmark.com)


It's mostly the temps, I think. I have 40,400 in FS, but my GPU temps are 20C lower on average (cold-air cooled). Also, my best result is with SMT off and a fixed 5GHz OC, which gives a lower physics score but a better combined score, which seems to weigh heavily in the total score. It's a Trio with the Strix OC BIOS on an open test bench. https://www.3dmark.com/fs/24383800


----------



## elox

Did some testing with my Gainward Phoenix 3080. The ASUS TUF OC BIOS works fine and gives me the best results so far, but the card seems hardlocked at 350W.
Since the TUF OC BIOS defaults to 340W, I get slightly better results with an undervolt than with the Gainward BIOS: 0.935V with ~2040MHz boost in Cold War and 1900MHz in Fire Strike Extreme.
Port Royal: 11270
Fire Strike Extreme: 19990
Not the best card, but no surprise.


----------



## cstkl1

The latest v2 Asus BIOS is more than just a fan fix.

On the 455.56 drivers, nvcpl is not detected and MSI AB shows totally blank readings; I cannot adjust anything.


----------



## josephimports

cstkl1 said:


> latest v2 asus bios is more than just fan fixing
> 
> 455.56 drivers.. nvcpl not detected in msi ab . total blank reading. cannot adjust anything


Notice any performance improvements with the v2 bios?


----------



## man from atlantis

Asus Strix OC, EVGA FTW3 Ultra, Gigabyte Aorus Xtreme, MSI Suprim X, Palit GameRock OC

Bios power table comparison


----------



## ScorpMCP

phoenixyz said:


> No need. Hold out for an ftw 3 ultra or suprim


Thanks! I was actually able to snatch an EVGA FTW3 Ultra just 10 minutes ago; getting it Monday, so I guess I'll sell the TUF. Woohoo!


----------



## nyk20z3

People are still thirsty for a 3080 when the Ti is about to drop? Patience, people, before you regret it later.


----------



## cstkl1

josephimports said:


> Notice any performance improvements with the v2 bios?


nothing to be honest


----------



## obscurehifi

man from atlantis said:


> Asus Strix OC, EVGA FTW3 Ultra, Gigabyte Aorus Xtreme, MSI Suprim X, Palit GameRock OC
> 
> Bios power table comparison


Thanks for posting these. Do the Asus and Gigabyte cards shown in your screenshots actually pull 90W from the slot?? 



----------



## man from atlantis

obscurehifi said:


> Thanks for posting these. Do the Asus and Gigabyte cards shown in your screenshots actually pull 90W from the slot??


That should be the max allowed power limit for pci-e slot, not actual pull values.


----------



## cstkl1

man from atlantis said:


> That should be the max allowed power limit for pci-e slot, not actual pull values.


What tool is that, dude?

Also, I wonder if this will show the issue with Asus cards and Quake II RTX.

Quake II RTX always runs at 1V; on most Asus cards that's 1980/2010MHz. If you OC, it will downvolt, looking for that 1980/2010.

But EVGA doesn't seem to have this problem.


----------



## obscurehifi

man from atlantis said:


> That should be the max allowed power limit for pci-e slot, not actual pull values.


I understand it's not actual pull values, but it certainly looks like it sets the limit on what the BIOS will allow the card to pull.

I always thought all the video card makers would cap the PCIe slot limits to the 75W PCIe specification. I was just reading up on this topic recently, and what I found was that there was some controversy back when the RX 480 pulled more than 75W, up to 90W when overclocked. It even led to AMD issuing a statement and a BIOS fix to lower the draw back down. I guess there were users claiming it was damaging things.

Here's a nice study on the issue. The author reached out to motherboard manufacturers and they said sustained power draw from the slot of 90W would very likely cause damage.
Power Consumption Concerns on the Radeon RX 480 - PC Perspective

It should be up to the card BIOS to limit the power draw to within the slot's specifications. I can understand pulling more than the 150W limit from the power supply over each 8-pin connector, since those connectors have been widely tested at higher power, but the slot's power capability is completely up to the motherboard design, and it pulls from the 3.3V and 12V motherboard supplies internally, which could rob power from other components.
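For context, the 75W slot figure breaks down into per-rail current limits in the PCIe CEM spec (5.5A on +12V and 3A on +3.3V for high-power cards), which a quick arithmetic check reproduces. Note that a 90W limit on the slot would exceed even the 12V rail's 66W share:

```python
# PCIe CEM spec current limits for a high-power (75 W) add-in card:
# 5.5 A on the +12 V rail and 3 A on the +3.3 V rail.
RAILS = {
    "+12V": {"volts": 12.0, "amps": 5.5},
    "+3.3V": {"volts": 3.3, "amps": 3.0},
}

per_rail_w = {name: r["volts"] * r["amps"] for name, r in RAILS.items()}
total_w = sum(per_rail_w.values())

print(per_rail_w)  # +12V: 66.0 W, +3.3V: ~9.9 W
print(total_w)     # ~75.9 W total slot budget
```

So the 90W entries in those BIOS power tables are well above what the slot is specified to deliver, which is exactly the concern raised with the RX 480.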


----------



## man from atlantis

cstkl1 said:


> what tool is dat dude.
> 
> also i wonder will this show the issue with asus cards and quake rtx ii
> 
> quake rtx ii always at 1v
> most asus cards this is 1980/2010
> if u oc. it will downvolt looking for that 1980/2010
> 
> but evga doesnt seem to have this problem


I'm not sure that's a problem, because Q2 RTX is really heavy on the GPU. I'm running it at [email protected] and the game hits 440W almost constantly.


----------



## cstkl1

man from atlantis said:


> I'm not sure if that's a problem. Because Q2 RTX is really heavy on GPU. I'm running it at [email protected] and the game hits 440W almost constantly.


I cooled it lower, into the 3x-4xC range; it still locks to 1980-2010MHz, dropping to 0.9V.

See SoldierRBT's video: he can go [email protected]; his is fixed at 1V.

His GPU is golden, but he can actually OC without the game throttling down to the voltage where 1980/2010 sits.


----------



## DarthBaggins

nyk20z3 said:


> People still Thirsty for a 3080 when the Ti is about to drop? Patience people before you regret it later.


This is why I have been waiting; availability has also curbed my impulse buying, as I refuse to pay above retail/MSRP for a card (always hunting for the next deal/steal).
My 1080 Ti is still holding strong at 2100MHz in most games; Destiny 2 is the only title where I have to remove my OC, thanks to one poorly optimized planet in the game. But I can't wait to be able to push my panel further, or more consistently.


----------



## acoustic

The Strix BIOS vs the EVGA FTW3 BIOS... it looks like the Strix allows more power to the VRAM, no? I wonder if this is why Strix cards seem to consistently have "good" clocking memory, while my EVGA, for example, suffers error correction at +550.

I might need to flash the Strix BIOS again and see if it affects the memory OC. I never tried messing with the memory when I had the Strix BIOS on the card (prior to EVGA releasing the 450W BIOS, it was the only way to get a higher power limit).


----------



## ducegt

Joining the club with a Gaming X Trio (it was in stock) that's flashed with the Suprim BIOS. I also just toyed with the Strix BIOS in part because of the differences shown in PL for MEM, but I didn't notice any difference. I can take MEM to +1200 before scoring regresses at 1300. Getting the voltage/frequency curve perfect seems very tedious because different game engines and even scenes make the GPU boost differently. My Trio can cool Suprim at the full 430w, but it takes about 90% fan speed on the Suprim BIOS which is a bit much and the benefits just aren't there to justify it. The Trio for me seems to handle 400w well at most. I honestly bought my card without knowing what brand or SKU it was lol.. every half second counts when you are racing to get it in the cart and checked out.

I've been able to get in various 3dmark Top 50 GPU-CPU 9900K combos and my best Port Royal is around 12,600 at ambient temps ~74F. Getting this thing OCed in games is very different and I'm trying not to be so greedy considering the 3080 is more than twice as fast as my Vega 64; and 1440p 240hz is the most exciting advancement I've had in at least 5 years.


----------



## ZealotKi11er

nyk20z3 said:


> People still Thirsty for a 3080 when the Ti is about to drop? Patience people before you regret it later.


What will that solve? It will most likely be $1000 so there is still a place for 3080.


----------



## ScorpMCP

ZealotKi11er said:


> What will that solve? It will most likely be $1000 so there is still a place for 3080.


----------



## Illusive Spectre

Port Royal Graphics score: 12601

I scored 12,601 in Port Royal (AMD Ryzen 5 3600, RTX 3080, 16GB RAM, 64-bit Windows 10, via www.3dmark.com).

RTX 3080 SUPRIM X OC settings:

+150 Core Clock offset
+1000 Memory Clock offset
Power limit is set to 116% (Max)
Fan speed is set to 100%

---

AMD Ryzen 5 3600 CPU - Everything in the BIOS is set to auto (Stock)
GSYNC is set to disabled
Windows 10 Version 20H2 - Hardware Accelerated GPU Scheduling is set to enabled
AMD Ryzen High Performance power plan


----------



## MichelitoSwiss

What better bios for gigabyte waterforce? has anyone tried with the extreme version?


----------



## Pedros

Damn, Illusive... you got a good sample... my Suprim X "sucks":
it can't do +150, can't do +120... I'm at +104 and settled at +90 for daily driving.

You don't mention it, but I guess your voltage slider is also maxed out in Afterburner, right?

It's been a pattern: I never seem to get a good sample of an Nvidia GPU.

Port Royal: I score 12,289...


----------



## Tergon123

Pedros said:


> Damn Illusive ... you got a good sample ... my Suprim X "sucks"...
> can't do 150 ... can't do 120 ... I'm at 104 and settled at 90 for daily drive.
> 
> You don't refer but I guess your voltage slider is also maxed out on Afterburner right?
> 
> It's been a pattern, I never seem to get a good sample of a gpu with Nvidia
> 
> Port Royal, i score 12 289 ...


Yeah, mine is around the same; I can't get it to run stably above +120 on the core. Oddly, with the Strix BIOS I could run all the way to +145, but the fans will not auto-run properly with it, so I would just run them at 100% for testing purposes. I also noticed that with the Strix BIOS on the MSI Suprim X, it didn't make any difference to my scores; in fact, scores were worse running it. Yes, the sliders ran higher (core, PL, and memory), but it didn't actually perform any better.


----------



## Pedros

Yeah, I mean, these are synthetics; for gaming they will all be around the same...
When stock pops up I may try to resell mine and get a different one...

Or not... LOL. Something I learned is, when companies say it's a binned chip... that means squat.


----------



## nyk20z3

ZealotKi11er said:


> What will that solve? It will most likely be $1000 so there is still a place for 3080.


More VRAM, etc. You always buy the strongest card possible so it has a longer shelf life. If you're willing to spend $800+ on a 3080, then you would spend a little more on a Ti; this gen seems to be heading in that direction with these stock issues. By the time there is normal stock and the Ti actually comes out (6900 XT, etc.), it will make more sense to go that route for the money, IMO, just like all the hype around the 3080 20GB, which was apparently axed. I just see a lot of people putting their cards up for sale in the near future. In the meantime, enjoy; it should be a fun next few months for GPUs.


----------



## criminal

nyk20z3 said:


> More VRAM, etc. You always buy the strongest card possible so it has a longer shelf life. If you're willing to spend $800+ on a 3080, then you would spend a little more on a Ti; this gen seems to be heading in that direction with these stock issues. By the time there is normal stock and the Ti actually comes out (6900 XT, etc.), it will make more sense to go that route for the money, IMO, just like all the hype around the 3080 20GB, which was apparently axed. I just see a lot of people putting their cards up for sale in the near future. In the meantime, enjoy; it should be a fun next few months for GPUs.


I can't speak for everyone, but I paid MSRP $729 for my Asus TUF. I would not be willing to spend $1k for a 3080Ti, for just 10% more performance. VRAM is not a concern for me in the slightest. Plus, good luck getting a 3080Ti at all when it is released. My guess is that it will be the hardest card out of the product stack to actually find in stock and buy.


----------



## Pedros

The 3080 Ti will not make a difference... unless you are pushing some serious 4K pixels; 10GB is still more than enough for the times to come.
Plus, there's a difference between spending X... and X plus 200-300 bucks.

By the time you get the 3080 Ti, 3080 users will have had their fun for months and maybe, just maybe, be eyeballing what the 4000 series brings.

It depends on each use case.


----------



## SirCanealot

Has anyone tried flashing any different bioses to the Palit Gaming Pro?

I know the PCB isn't that great, so I wouldn't want to go crazy, but Palit's own GameRock cards seem to have a power limit of 400W rather than the 350W of the Gaming Pro. I have the card under an AIO, so I wouldn't mind trying a small boost in power limits...


----------



## Aurosonic

hey guys,

my first message in this thread. I'd like to say a huge thanks to all of you who shared your knowledge about Ampere shunting, overclocking, configuring and flashing here in this thread.
Especially to *cstkl1* (for his shared experience), *dr.Rafi* (for his shunting experiments), *bmgjet* (for shunting info and the Shuntmod Calculator), *PhoenixMDA* (for his tips and experience), *SoldierRBT* (for his golden chip and challenge... although I believe he will beat us all once he gets proper watercooling), *Falkentyne* (for his info about different shunting methods), *olrdtg* (for his 3090 FE shunting thread) and the other guys.

I've never done a power mod before and was afraid to do it for the first time, especially with no info or shunting experience out there for my card, given its cost. In early October I got a Palit 3080 GameRock OC, which had 3x8-pin, but all I was able to get with the 450W limit was 12,600 points in Port Royal. Not long ago I moved to a Gigabyte Aorus RTX 3080 Xtreme Waterforce WB. Unfortunately, it has only 2x8-pin connectors and a 370W factory BIOS, which in practice had only a 350W limit. That's a huge bottleneck for a watercooled card with a decent chip, and I have no logical explanation for why Gigabyte put 3x8-pin on their air-cooled 3080 Xtreme and just two on the watercooled version. More than that, they did the same to their 3090 Waterforce WB card, which is absolutely criminal if you ask me. Anyway, I tried all the available BIOSes for 2- and 3-pin cards from other vendors with no luck; every time I was limited to 350W, and there was nothing left but shunting. After that I read all 148 pages of this thread, and that info answered pretty much all the questions I had about VGA shunting (thanks to all the people mentioned above). So I decided to shunt.

After removing the waterblock and backplate I found that this card has the same PCB design as the Aorus Master except for the chip capacitors, which are 5 SP-Cap + 1 MLCC instead of 6 SP-Cap. And I found a 10A fuse on the PCIe slot power line. So I decided to go with 10mOhm resistors stacked on top of the 5mOhm originals, which gives a 1.5x power boost over the 350W BIOS limit.
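For anyone double-checking the math, here is a quick sketch of the stacked-shunt arithmetic (function names are mine; the resistor values are from this post). The controller still assumes the stock 5mOhm shunt, so the lower parallel resistance makes it under-read current, and power, by the same factor.

```python
# Shunt-mod power multiplier: stacking a resistor on top of the original
# shunt lowers the effective resistance. The power controller still divides
# the sensed voltage by the original value, so it under-reads real power.
# Values match the post: 5 mOhm stock shunts with 10 mOhm stacked on top.

def parallel(r1: float, r2: float) -> float:
    """Effective resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

def power_multiplier(r_stock_mohm: float, r_stacked_mohm: float) -> float:
    """Factor by which real power exceeds reported power after stacking."""
    return r_stock_mohm / parallel(r_stock_mohm, r_stacked_mohm)

mult = power_multiplier(5.0, 10.0)
print(f"effective shunt: {parallel(5.0, 10.0):.3f} mOhm")   # 3.333 mOhm
print(f"power multiplier: {mult:.2f}x")                     # 1.50x
print(f"real power at a reported 350W cap: {350 * mult:.0f}W")  # 525W
```

That 525W figure is exactly the "525W powermode" quoted later in the post.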

































Before making the shunt mod I was worried about my chip's capabilities and decided to check it with the method provided by *PhoenixMDA*.
The result was 2295 dropping to 2280, which means a decent chip, I guess. And that result was exactly the same after heavy load and further tests after the shunt mod.










Here's my maximum results before the shunt and at same clocks after the shunt for comparison:

















On top of that, I was happy to learn that my memory scales perfectly and works at +1500 without performance drops or any issues.

And here are the final results I was able to get on my 525W power-modded card (which never exceeded 325W reported x 1.5 multiplier = 487.5W):

*Port Royal: 13518 *( https://www.3dmark.com/pr/681024 )









*Timespy: 20141* ( https://www.3dmark.com/spy/16633194 )









*Superposition 1080p Extreme: 13376*









*Superposition 8k Optimized: 7608*


----------



## Falkentyne

Nice results. That puts you above stock 3090 land.

Did you check the power balancing in gpu-z, with each wattage rail field set to maximum, in green, by clicking over it until it changes to max?


----------



## obscurehifi

I've been running a bunch of tests lately on my AORUS Xtreme Waterforce AIB. I've previously mentioned the power seemed to be capped somewhere around ~350W, just like others have mentioned - most recently @Aurosonic in the post above.

What I found was that if I DON'T use an undervolt/overclocking method that locks the power down and doesn't seem to allow boosting (my theory, even though I was targeting 350W-ish), I was able to get it to pull a little more power. I ended up with +1000 memory and +107 clock (offsetting the factory curve). I just did this with the AORUS tool instead of MSI AB; they seem to yield the same results.

NOW, what I found interesting is that there seems to be a power reserve left for the LEDs and the fans. The LEDs instantly change the power consumed by 3W, whether under load or idling. Then, running the fans at MAX, they draw about 6 to 8W more than the auto fan curve. That's 9-11W in reserve for those two components, and adjusting them doesn't seem to affect the power drawn by the rest of the card. Now, when I check the overall power consumed with the LEDs on, the fans maxed out, and an offset OC curve, I get right up next to 360 or 370W depending on the tool. The graphs below are from Open Hardware Monitor. So, long story short: I think the card is actually using all of its 370W, or at least most of it. It's probably holding a little in reserve for other processes, just like it does for the LEDs and fans, or so is my guess. This is the chart from yesterday's testing; if I recall, most of these runs were Port Royal. You can see at 20:30 the power is at 350W, where I locked the fan to its lowest speed and turned the LEDs off. Then, just after 20:10, I had the fans maxed, turned the LEDs back on, and changed my OC method.
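As a quick sanity check on the reserve theory, the observed deltas roughly add up. A back-of-the-envelope sketch (variable names and the 7W fan midpoint are my assumptions; the wattages are the ones reported above):

```python
# Rough budget check against the ~370W AORUS BIOS limit, using the deltas
# observed in the post above.
base_load_w = 350   # reported board power: LEDs off, fans at minimum
led_delta_w = 3     # LEDs on vs off
fan_delta_w = 7     # fans maxed vs auto curve (observed 6-8W, midpoint)

total = base_load_w + led_delta_w + fan_delta_w
headroom = 370 - total
print(f"estimated total draw: {total}W of the 370W limit")  # 360W
print(f"unaccounted reserve: {headroom}W")                  # 10W
```

That lines up with the "right up next to 360 or 370" readings, with roughly 10W left unexplained.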









GPU-Z seems to report maximums that are about 10W higher than Open Hardware Monitor above. Which is correct? Who knows. Above is with 1-second polling and below is with 1/2-second, so maybe that accounts for some of the difference.










Another nice thing about running the fans at max on the AIB is that the temps during benchmarks stay in the 40s. Running Time Spy Extreme, it maxes out at 44C. Here are four runs of TSE; I was testing the Ryzen PBO modes, from L3, L2, L1, to off. These were with the LEDs off and fans maxed, and used the same +1000 mem and +107 clock.


----------



## Aurosonic

Falkentyne said:


> Nice results. That puts you above stock 3090 land.
> 
> Did you check the power balancing in gpu-z, with each wattage rail field set to maximum, in green, by clicking over it until it changes to max?


I think what you're asking about is on the screenshot of the Port Royal results. The maximum I saw was 325W board power, 212W GPU chip power (I guess that's a short spike), 60W MVDDC, 60W PCIe slot power, and 140-145W on each of the two PCIe 8-pins. Compared to my stock results before shunting, it's almost exactly the 1.5x multiplier.


----------



## Aurosonic

obscurehifi said:


> NOW, what I found was interesting is there seems to be a power reserve left for the LED's and the Fans.


That’s exactly my thoughts!


----------



## outofmyheadyo

Aurosonic said:


> In early October I got a Palit 3080 GameRock OC, which had 3x8-pin. But all I was able to get with its 450W limit was 12600 points in Port Royal. Not long ago I moved to the Gigabyte Aorus RTX 3080 Xtreme Waterforce WB. Unfortunately it has only 2x8-pin connectors and was limited by a 370W factory BIOS, which in practice was a 350W limit. [...] Anyway, I tried all possible BIOSes from other vendors' 2- and 3-pin cards with no luck. Every time I was limited to 350W, and there was nothing left but shunting.


So there's no point in trying the BIOS of the 3080 Xtreme (air-cooled version) on the 3080 Xtreme WB? The air-cooled version seems to have a 450W limit vs 370W on the WB version. I'm pretty disappointed by the 3080 WB.


----------



## ScorpMCP

outofmyheadyo said:


> So there is no point in trying the bios of the 3080 xtreme ( aircooled version ) on the 3080 xtreme WB ? The aircooled version seems to have 450W limit vs 370 on the WB version, im pretty dissapointed by the 3080 WB


The watercooled one only has 2x8-pin, right? No point using a 3x8-pin BIOS then, I think.


----------



## PhoenixMDA

@Aurosonic
Very nice work, your soldering is really perfect, and your points are also awesome.
5.55GHz is very good; I have tested 5.5GHz with my chip, which works, but I haven't gone further. My chip is also really nice.
Best uncapped VCore run in the cold: 5,[email protected],137V.









I think once Soldier has the waterblock for his gold chip, he will beat my 20806 graphics points; that is the limit of my card with my 25% extra power.
I must say the frametimes with the 10900K are better than with my 9900K, and the 9900K was nearly perfect.

I'm really satisfied with my system.


----------



## Aurosonic

PhoenixMDA said:


> I think once Soldier has the waterblock for his gold chip, he will beat my 20806 graphics points,


Thanks! Any tips/tweaks to make the Time Spy GPU score a little better?


----------



## PhoenixMDA

I have set AF to off in the NVIDIA driver, pre-rendered VR frames to 4, and texture quality to high performance; that's all.
Perhaps there are more opportunities, but that is more of a cheat, and those are no true points.

In 24/7 use I get a little over 20k graphics score with the newest driver and standard driver settings (quality); I think that's very good.

With more RAM OC (4600 CL16-16) it's also possible to reach up to 17.5k CPU [email protected],5Ghz; snakeeyes from HWL tested that.
But with such benches, don't forget you can crash your Windows and have to reinstall; that's the reason I leave it.^^
For 24/7 I run [email protected],57V.


----------



## acrvr

Posting about my experience with AIO cooling on the 3080 Strix. My friend with a 3080 FTW3 has the exact same experience. This was using a custom 3D-printed Asetek AIO bracket, since drilling the G12 Kraken puts the cold plate off-center from the GPU.

Temps are good with the 280mm AIO and a good mount: 41-42C under load, 25C idle. But after just a few minutes of gaming, or running Heaven 4.0, even at the temps mentioned, the GPU would heavily downclock to sub-1800MHz and 0.8V. No voltage locking in MSI AB can overrule this. It's the exact same behavior on two different brands, as I mentioned, Strix and FTW3, at the same temps. I have three 120mm fans blowing at the GPU chip side to cool the RAM and VRM, and the stock backplate is still mounted. On stock cooling my Strix has no problem holding 2000MHz while gaming, temps 70+ at 440W draw.

We are guessing there is a hidden temp sensor somewhere on the Ampere cards. Curious to see if anyone else has had success with AIOs on Ampere.


----------



## FeelsBadMan

This behaviour sounds like vram reaching high temps and throttling the card. From my own testing on vram intensive tasks (not gaming) with a 3080 Vision with the stock cooler the card starts to throttle even though the die temp is 50-58C and you need to ramp up the fans to start overcoming it. In gpuz/hwinfo/afterburner it even reports temp limit as perf cap reason while having the core temps mentioned above. So nvidia has a way to get vram temps with their drivers but we don't have access to it. Never encountered this in benchmarks or gaming though since vram isn't used that heavily and the vram contact with the coldplate is enough. The Strix has a dedicated vram mini-heatsink if I'm not mistaken, try mounting it, blast it with a fan and see if it improves. GDDR6X runs a lot hotter than GDDR6 by design too.


----------



## ssgwright

I NEED HELP! So I just got the silver paint in to cold-solder the shunts. I've been running liquid metal, and I knew I had to get it off before my shunts fell off. Well, as I was cleaning them, the two main shunts by the power delivery of course came off... So first I tried soldering them back on, but for the life of me I couldn't get the solder to stick. I could get it to stick to the card and the shunts separately, but when I tried to combine the two they just wouldn't stick... So then I tried hot glue: I put a little liquid metal on all the contact points of the shunts to make sure they would conduct, and hot-glued them. This worked; however, when I booted and ran Kombustor my card was gimped... it wouldn't go above 1600MHz... So then I took off the hot-glued shunts and used just liquid metal between the shunt contacts, but it's still doing the same thing. Anyone know what's going on?


----------



## acrvr

FeelsBadMan said:


> This behaviour sounds like vram reaching high temps and throttling the card. From my own testing on vram intensive tasks (not gaming) with a 3080 Vision with the stock cooler the card starts to throttle even though the die temp is 50-58C and you need to ramp up the fans to start overcoming it. In gpuz/hwinfo/afterburner it even reports temp limit as perf cap reason while having the core temps mentioned above. So nvidia has a way to get vram temps with their drivers but we don't have access to it. Never encountered this in benchmarks or gaming though since vram isn't used that heavily and the vram contact with the coldplate is enough. The Strix has a dedicated vram mini-heatsink if I'm not mistaken, try mounting it, blast it with a fan and see if it improves. GDDR6X runs a lot hotter than GDDR6 by design too.


Thanks, but the Strix doesn't have the mini heatsinks. It's a shame there is no way to check VRAM temps. I tried running the memory at stock as well, and it still downclocks.

Overall I'm not too bothered by it. The performance out of the factory is great and I'm happy with the frames I'm getting. 

I just want to share this for others who may want to consider an AIO for their cards.


----------



## PhoenixMDA

ssgwright said:


> I NEED HELP! So I just got in the silver paint to cold solder the shunts. I've been running liquid metal and I knew I had to get it off before my shunts fell off. Well, as I was cleaning them the two main shunts by the power delivery of course came off... So, first I tried soldering them back on but for the life of me I couldn't get the solder to stick. I could get it to stick to the card and the shunts separately but when I tried to combine the two they just wouldn't stick... so then I tried to use hot glue, I put a little liquid metal on all the connection points of the shunts to ensure they would conduct and hot glued them. This worked however, when I booted and tried kombuster my card was gimped... wouldn't go above 1600mhz... so then I took off the hot glued shunts and used just liquid metal between the shunt connections but it's still doing the same thing? Anyone know what's going on?


You need soldering flux like this; don't try to solder without it anymore.
https://www.mouser.de/ProductDetail/?MG-Chemicals/8341-10ML/&qs=/ha2pyFaduhmRpj8HlJRvoERqexF%2biGDQDC9d3Z%2byP4=

You must clean up the solder joints first, then use the flux, and also use solder that contains flux when soldering.
Tin the pads first without the shunts; try to fill them up with solder.

When you are finished, don't forget to clean up the solder joints.

@ssgwright
P.S. If you can't solder yourself, have it done by a person with experience.


----------



## dev1ance

Pedros said:


> It's been a pattern, I never seem to get a good sample of a gpu with Nvidia


The bin lottery is just too real with the core but also mem. My Suprim does +120 on core but mem is a meager +800 while everyone seems to be +1000 or more. My Strix on the other hand does +135 on core and mem + 1300 but my friend's Strix can only do +90 and +700 on mem (anything more and error correction kicks in hard for him). My Suprim does better with undervolts (can hold 2115-2130 at 1.05v while my Strix needs 1.081v for the same clocks). Suprim > Strix when it comes to cooling for me. I spent hours playing Cyberpunk with fans spinning at 2000 RPM and temps fluctuated between 58-65 on the Suprim but 2000RPM on the Strix has temps between 64-71 for me with ambient at 26.


----------



## Aurosonic

dev1ance said:


> The bin lottery is just too real with the core but also mem. My Suprim does +120 on core but mem is a meager +800 while everyone seems to be +1000 or more


It's a lottery for sure. I had 3 cards before: a 3080 Strix (max mem was +800), a 3080 FTW3 (max mem was +1250), a Palit 3080 GameRock OC (max mem +700), and now the Aorus Xtreme Waterforce (mem +1500).


PhoenixMDA said:


> 5,55Ghz are very good, 5,5ghz i have tested with my Chip that works but not more tested.My Chip is also really nice.
> Uncappered best VCore Run in Cold by 5,[email protected],137V.


My chip wasn't able to pass LinX higher than 5200/1.38V, but after delidding ("scalping") it I'm able to pass LinX at 5350/1.42V with over 700 GFLOPS, and that's my daily config. That 5555 needs 1.46V and water temps of about 12-15 Celsius; I use it for benches only.








And here's my system. I set Cyberpunk 2077 theme colors at the time.


----------



## StreaMRoLLeR

Soon I'll try this out. I also have the 1000W XOC BIOS as well. Let's hope the chip is not a dud.


----------



## asdkj1740

Aurosonic said:


> After removing the waterblock and backplate I found that this card has the same PCB design as the Aorus Master except for the chip capacitors, which are 5 SP-Cap + 1 MLCC instead of 6 SP-Cap. And I found a 10A fuse on the PCIe slot power line.
> 
> View attachment 2470918


Thanks for the picture, dude. I didn't know the Xtreme WB has a full set of SP-Caps/POSCAPs instead of mixing them with MLCCs, like the rest of the models except the Turbo, which is currently exclusive to the 3090.

What a joke. Paying more than $900 USD to get the full SP-Caps/POSCAPs that are otherwise used only on the entry-level 3090 Turbo.


----------



## Tergon123

dev1ance said:


> The bin lottery is just too real with the core but also mem. My Suprim does +120 on core but mem is a meager +800 while everyone seems to be +1000 or more. My Strix on the other hand does +135 on core and mem + 1300 but my friend's Strix can only do +90 and +700 on mem (anything more and error correction kicks in hard for him). My Suprim does better with undervolts (can hold 2115-2130 at 1.05v while my Strix needs 1.081v for the same clocks). Suprim > Strix when it comes to cooling for me. I spent hours playing Cyberpunk with fans spinning at 2000 RPM and temps fluctuated between 58-65 on the Suprim but 2000RPM on the Strix has temps between 64-71 for me with ambient at 26.


Very similar findings with my Suprim as well. An easy +120 on core; memory does a bit better and can pull +900. However, the fan curves are too conservative for my use: running CoD for a few hours it sits at or around 80+, too hot. Crank the fans to 70% (2200+ RPM) and the card runs in the 60s, which is perfect; with headphones I don't hear any of the noise. I have not taken the card apart at all, just enjoying it, and it runs really well. I have messed with a few different BIOSes. The Strix BIOS was not good: it messes up the fans a lot. The core ran up to +145 and memory went to +1000 with the Strix BIOS, but I had to set the fans manually, and the scores in the usual benches were no better even with the higher clocks, which is very odd. Time Spy with the stock BIOS at +122 core / +900 mem gets 17,456; with the Strix BIOS at +145 / +1150 it gets 17,034. Explain that, it makes no sense. So I ended up flashing back to the stock BIOS because of the fans.


----------



## joyzao

Is it possible to use a BIOS from a 3x8-pin board on a 2x8-pin card? I have the RTX 3080 TUF OC.


----------



## Aurosonic

joyzao said:


> is it possible to use a bios from a 3x8pin board to a 2x8pin? I have rtx 3080 tuf oc


You can use it for sure, but the results will be worse.


----------



## man from atlantis

I just tried the Strix BIOS on my GameRock out of curiosity. The card already has a 440W power limit by default. TL;DR: it was a mess.
The fans start spinning at 50% and can still reach 3000rpm at 100%, which was cool at first, because it meant I could use a custom Afterburner fan curve and keep the zero-fan option at the same time. What could go wrong?
I tried Kombustor to see if the 450W power limit works; nope, it didn't. For some reason the card locked itself to 360W GPU power and the base 1440MHz clock speed. Probably some fail-safe protection kicked in.
The power reporting in HWiNFO was most likely false, because at the same GPU power my total system consumption is 540W, while the UPS was showing 420W with the Strix BIOS. The 8-pin power readings were all over the place: 90, 120, and 150W respectively, and MVDDC in GPU-Z was showing over 500W. Then I closed Kombustor and reflashed my old BIOS.












http://imgur.com/a/rmMkZFi


----------



## obscurehifi

One of the features that narrowed my search for a 3080 down to only two brands (Gigabyte and ASUS) was the additional monitor connections. The ASUS cards have 5, the non-AORUS Gigabyte cards have 5, and the Gigabyte AORUS has 6. All other 3080s I have seen only have 4 (fine for most setups, of course!). One of my computers runs a triple monitor setup as well as a dual monitor setup on my desk, which makes 5 ports a minimum requirement.

Fortunately, I was able to snag an AORUS with the 6 ports and now have a few observations/results to share.

The triple monitors are Samsung 27" CRG5s (1080p 240Hz), roughly the equivalent of 3K at 240Hz when combined. 240Hz is offered on both HDMI and DP on this model.
Then the single gaming monitor on my desk is a Samsung G7 (1440p 240Hz). This one makes the configuration tough because 240Hz is only available over DP.

What I found was that the triple monitor setup with NVIDIA Surround would only offer 240Hz when using matching connectors: all HDMI or all DP. Since my G7 requires DP to do 240Hz, that leaves HDMI for the triples. What was concerning is that while there are three HDMI ports available, they are a mix of HDMI 2.1 and HDMI 2.0.

I'm happy to report that running the three 1080p monitors on the three HDMI ports, even though one is the slower 2.0 specification, works perfectly fine in my sim racing games at 5760x1080 @ 240Hz! That's a relief, because it leaves my G7 free to use DP for 1440p 240Hz gaming. Since the G7 can only do 240Hz over DP and NVIDIA Surround needs a common connection type to run 240Hz, the AORUS cards are the only 3080s that would work for me, unless I traded my G7 for a different 1440p monitor offering 240Hz over HDMI. The ASUS cards have 3 DP and 2 HDMI, so with the G7 taking a DP, the ASUS wouldn't work in my setup. I'm just happy that the first 3080 I was actually able to buy was an AORUS!
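The 1080p240-over-HDMI-2.0 result makes sense on paper, since each Surround monitor gets its own cable and only carries a single 1080p stream. A rough bandwidth check (the blanking values are approximate reduced-blanking figures I assumed, not the monitor's actual EDID timings):

```python
# Why a single 1080p 240Hz stream fits in HDMI 2.0: compute the raw video
# bandwidth including blanking and compare it to the link's data rate.

def required_gbps(h, v, hz, bpp=24, h_blank=80, v_blank=38):
    """Uncompressed video bandwidth in Gbit/s, including blanking overhead."""
    pixel_clock = (h + h_blank) * (v + v_blank) * hz   # pixels per second
    return pixel_clock * bpp / 1e9

HDMI20_DATA_GBPS = 14.4   # 18 Gbps link rate minus 8b/10b encoding overhead

need = required_gbps(1920, 1080, 240)
print(f"1080p240 needs ~{need:.1f} Gbit/s; HDMI 2.0 carries {HDMI20_DATA_GBPS}")
print("fits" if need <= HDMI20_DATA_GBPS else "does not fit")  # fits
```

By the same arithmetic, pushing all three displays down one cable would not fit, which is why per-monitor connections matter here.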

An important thing to note for other triple monitor users: NVIDIA Surround would let me use a mixture of connection types, but it would then cap the refresh rate at 120Hz, which may work just fine for others. Seeing how I just purchased all the 240Hz monitors mentioned above, I wasn't going to settle for 120Hz.

The four monitor limit still applies, but I can run multiple configurations: triples only, triples with another single, single on desk, double on desk. On a side note, the 3080 doubled my FPS on the triples over my 1080 G1 Gaming, allowing me to run several racing games at 3K in the 180 to 240 FPS range at pretty high graphics settings. One happy gamer here!


----------



## Aurosonic

man from atlantis said:


> I just tried Strix bios on my GameRock out of curiosity.


My best success with this card was using the FTW3 450W BIOS. Under the same conditions, the card's power consumption was 40-50W less than with the stock 440W BIOS.


----------



## ssgwright

OK, I had a professional solder my shunts back on for me ($40, not bad). However, for some reason I'm hitting the power limit at 0.8V at around 2000MHz? What the hell is going on with this thing?


----------



## Comalive

My Eagle OC has the fans spin up periodically at idle, much like this: https://preview.redd.it/ff1mwmze0mz...bp&s=5e4f83c2dfd26beae2fccad042d8bf28f7bfa079
What vBIOS would you recommend flashing onto this card to fix this fan behaviour?


----------



## Falkentyne

ssgwright said:


> ok I had a professional solder my shunts back on for me ($40 not bad). However, for some reason I'm hitting power limit at .8v at around 2000mhz? What the hell is going on with this thing?


New shunts or the original shunts? Is it shunt modded or back to stock? If he stacked shunts on, or replaced the originals with 3 mOhm, then something might be wrong.

You can easily hit power limit at 2000 mhz at 0.8v if you're not shunt modded. That's pretty normal.

Can you please post a benchmark run, and after it's done, post the GPU-Z sensors window? Please _remember_ to click "maximum" in each wattage value in gpu-z. (I keep having to remind people to do this, they keep forgetting).

Also bonus points, have hwinfo64 open at the same time and have the "TDP Normalized%" and TDP%" visible in the maximum column please.

Thank you.


----------



## ssgwright

Falkentyne said:


> New shunts or the original shunts ? Are you shunt modded or 'back to stock'? If he modded shunts on to stack, or replaced the original ones with 3 mOhm, then something might be wrong.
> 
> You can easily hit power limit at 2000 mhz at 0.8v if you're not shunt modded. That's pretty normal.
> 
> Can you please post a benchmark run, and after it's done, post the GPU-Z sensors window? Please _remember_ to click "maximum" in each wattage value in gpu-z. (I keep having to remind people to do this, they keep forgetting).
> 
> Also bonus points, have hwinfo64 open at the same time and have the "TDP Normalized%" and TDP%" visible in the maximum column please.
> 
> Thank you.


Thanks for always helping people out here! It's acting normal... I gamed and can run Cold War at 2130 easily because it's not hitting the power limit. I'm just used to the shunts being modded lol.

However, I tried that paint you recommended and it's not working. Tomorrow I'll maybe try adding more; we'll see.


----------



## Tergon123

ssgwright said:


> thanks for always helping people out here! It's acting normal.. gamed and i can run cold war at 2130 easy because it's not hitting power limit. I'm just not used to the shunts being modded lol.
> 
> However, I tried that paint you recommended but it's not working? Tomorrow I'll maybe try and add more, we'll see.


Try a hot glue gun; I know quite a few people who use that trick for shunts.


----------



## ssgwright

I might try the glue gun trick; I think the wife has one for her hobbies.

Cards get pretty hot; doesn't the glue melt?


----------



## acoustic

Hybrid cooler on my FTW3 Ultra works like a charm. 12826 Port Royal score, beat my old top score.

I'm not looking for higher clocks, I just wanted quieter gaming. All my fans are @ 900-950rpm, and the 3080 was obviously much louder than anything else. I slapped 4 ML120 Pros on the 240mm rad (will grab Noctua A12x25s once they're back in stock and not price-gouged for $50 per fan) and with them at 1100rpm, it's working great. 50-52c solid in CP'77, card just sits at 1.081v.

Perfect and quiet!


----------



## PhoenixMDA

ssgwright said:


> I might try the glue gun trick think the wife has one for her hobbies
> 
> cards get pretty hot, the glue doesn't melt?


Leave it... soldering is the best and safest way.
If you want to get more power, have it soldered; for 24/7 use, 20mOhm is enough and safe (on all 6 shunts).


----------



## Youngtimer

Finally joining the club with an MSI Suprim X. It was a really long search, because availability in Germany is still not good. Nevertheless, I could collect my Suprim just a day before Christmas, and now I'm happily testing the card. It replaces a watercooled 1080 Ti which worked flawlessly for years at 2GHz and 1.000V.
The first impressions are very good: build quality, the cooler, fan noise, and performance are very impressive. But since I use a custom loop, the Suprim will be integrated as soon as my waterblock arrives sometime in January. I decided to go for an Alphacool block, and I'm really looking forward to it.




























Cyberpunk 2077 runs as well as I hoped and looks the bomb, with ray tracing ultra at 3440x1440 and ~60fps.








My weekly WoW Classic raid runs passively, with the fans standing still.









Everyday settings at the moment are 2010MHz at 0.950V and a beefy VRAM overclock of +1500.








Trying to find the maximum performance, I got a 19k Time Spy GPU score with a curve at 2100MHz at 1.000V, but the boost drops a lot during the benchmark.








Link: https://www.3dmark.com/3dm/55525339?

So far I'm quite happy, but maybe you have an idea what I could try to further improve my score?


Greetings from Germany
Youngtimer


----------



## SoldierRBT

@Youngtimer

Nice build! Congrats on your purchase. From your results, I'd say you already have nice scores. If you'd like to improve them, I'd suggest the following:

Lower GPU temps as much as possible to keep the OC high
Flash a 450W BIOS on it for more headroom; I believe the Suprim BIOS is only 430W
Make sure your memory OC isn't decreasing your score. When memory is unstable, FPS drops. Test +1200, +1300 and +1400 and see which one is better
Do the Nvidia panel tweaks


----------



## Youngtimer

@SoldierRBT: Thanks for your suggestions. The temps will drop when the waterblock arrives, but I will definitely try lowering the memory overclock. Thanks!


----------



## Flisker_new

nyk20z3 said:


> People still Thirsty for a 3080 when the Ti is about to drop? Patience people before you regret it later.


It's not going to be like the 1080 vs 1080 Ti; even the difference between the 3080 and 3090 is relatively small when talking about gaming, so if the 3080 Ti lands somewhere in the middle... it seems kinda irrelevant, at least to me.


----------



## Tergon123

ssgwright said:


> I might try the glue gun trick think the wife has one for her hobbies
> 
> cards get pretty hot, the glue doesn't melt?


Your card should not be getting that hot, put a fan on it.


----------



## Tergon123

Tergon123 said:


> Your card should not be getting that hot, out a fan on it.


Put


----------



## mtbiker033

How can I get a 3080 now? I'm just looking for a Founders or a model I can put a water block on.


----------



## ScorpMCP

mtbiker033 said:


> How can I get a 3080 now? I'm just looking for a Founders or a model I can put a water block on.


Not sure how it is in the States, but I'd check stores that don't have a queue system. I know Amazon usually has some cards you can order a few days before they get them, so I'd check there, and hopefully you can snag one. Founders Edition might be harder to get ^^


----------



## ssgwright

Well, I tried the hot glue shunt mod on all shunts... not working very well. I do see a slight decrease in board power draw, maybe? idk... I wish there was something conductive (besides liquid metal) you could use between shunts to ensure contact.


----------



## Falkentyne

ssgwright said:


> Well, I tried the hot glue shunt mod on all shunts... not working very well. I do see a slight decrease in board power draw, maybe? idk... I wish there was something conductive (besides liquid metal) you could use between shunts to ensure contact.


What exactly are you doing? Are you stacking shunts? What are the original shunts mOhms and what are the shunts you are trying to stack on them?
Are the original shunts edges (silver edges) flush with the black housing or are they LOWER than the black housing?
Do you have any high res pictures of the ORIGINAL shunts (cleaned)?


----------



## ssgwright

Falkentyne said:


> What exactly are you doing? Are you stacking shunts? What are the original shunts mOhms and what are the shunts you are trying to stack on them?
> Are the original shunts edges (silver edges) flush with the black housing or are they LOWER than the black housing?
> Do you have any high res pictures of the ORIGINAL shunts (cleaned)?


The originals are 5 mOhm and I stacked 5 mOhm on top; I put the shunt on top, pressed hard, and hot glued it. The original shunts do have silver edges and they're pretty flush, and sorry, I don't have any pictures.


----------



## Falkentyne

ssgwright said:


> The originals are 5 mOhm and I stacked 5 mOhm on top; I put the shunt on top, pressed hard, and hot glued it. The original shunts do have silver edges and they're pretty flush, and sorry, I don't have any pictures.


You can't hot glue shunts on each other. Please UNSUBSCRIBE from that piece of crap FRAMECHASERS beginner mentally ill youtube link and NEVER WATCH anything that tool uploads ever again.

The 'hotglue' method (or liquid electrical tape) was only for securing shunts that had been fastened using conductive silver paint like MG 842AR, but we found that the tape/glue is unnecessary since MG 842AR creates a sufficient bond on its own.

So you should just solder the shunts properly (remember to use flux!!!) or use MG 842AR paint if you don't want to solder.
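For anyone following along with the shunt talk: the reason stacking an identical shunt works at all is plain parallel-resistor math. Two equal resistors in parallel halve the sensed resistance, so the controller (still assuming the stock value) reads half the real current and lets the card pull roughly twice the power. A quick sketch, not specific to any card:

```python
def parallel_mohm(*shunts):
    """Equivalent resistance (milliohms) of shunts stacked in parallel."""
    return 1.0 / sum(1.0 / r for r in shunts)

stock = 5.0                          # original 5 mOhm shunt
stacked = parallel_mohm(stock, 5.0)  # identical 5 mOhm shunt on top

# The controller computes current as V_sense / R_assumed with R_assumed
# still 5 mOhm, so reported power scales by R_real / R_assumed.
print(stacked)           # 2.5
print(stacked / stock)   # 0.5 -> the card reports half its true draw
```

It also shows why a bad joint is worse than no mod: if the paint or solder only half-connects, the effective resistance (and thus the reported power) sits at some unpredictable value in between.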


----------



## SPL Tech

Does anyone know if I can flash a different BIOS to my Evga XC3 Ultra 3080 to enable a higher power limit? Like, could I flash the FTW3 BIOS or a BIOS from another manufacturer?


----------



## ssgwright

Falkentyne said:


> You can't hot glue shunts on each other. Please UNSUBSCRIBE from that piece of crap FRAMECHASERS beginner mentally ill youtube link and NEVER WATCH anything that tool uploads ever again.
> 
> The 'hotglue' method (or liquid electrical tape) was only for securing shunts that had been fastened using conductive silver paint like MG 842AR, but we found that the tape/glue is unnecessary since MG 842AR creates a sufficient bond on its own.
> 
> So you should just solder the shunts properly (remember to use flux!!!) or use MG 842AR paint if you don't want to solder.


So put the paint down and then stack the shunt on the paint?


----------



## ssgwright

OK, so I did that, and GPU-Z is reading a 280W board power draw, but in PR it will only volt up to 0.75 at about 1800 MHz??? So I pulled all the shunts off and cleaned the board: same thing??!!!! Max power pull is 280W and it says I'm hitting a power wall. What could be wrong with this card?


----------



## ssgwright

When I run Kombustor, I see it's pulling 150W from pin #1 but only 100W from pin #2?


----------



## Falkentyne

ssgwright said:


> OK, so I did that, and GPU-Z is reading a 280W board power draw, but in PR it will only volt up to 0.75 at about 1800 MHz??? So I pulled all the shunts off and cleaned the board: same thing??!!!! Max power pull is 280W and it says I'm hitting a power wall. What could be wrong with this card?


What exactly did you do?
What paint did you use ?
DID YOU SHAKE THE PAINT REPEATEDLY BEFORE OPENING IT? It has to be mixed VERY well before use!

(Note: if this is the MG 842AR jar version, do not keep the jar open, always close it instantly, because the fumes are toxic and the paint will dry out also, which you don't want).

Don't use the circuitwriter pen. It won't work.
Don't use random generic silver paint. It won't work.
The only paint that has been verified to work is MG 842AR-L (the pen version works as well but make sure you SHAKE the paint well).

If you are using MG 842AR silver paint, you need to SCRAPE THE EDGES of the shunts until they are a bright silver color, because you need to remove the conformal coating from the shunts.
I am not sure if the new shunts also have that coating, but you should scrape them to be sure. If you scrape and nothing happens, then there's probably no coating on them. Conformal coating makes the edges of the shunt look duller and less silvery/shiny.

Also if the edges of the original shunts are in any way lower than the middle (even if its 0.2mm--it's still lower), you will need to use a thicker layer of paint to make sure both shunts' silver edges are bridged together. The shunts need to be "attached" to each other via the silver conductive edges, _NOT_ through the black middle housing! That is important.

So try scraping the silver edges of the shunts first. This will take a while, so DO IT SLOWLY and carefully. You do NOT want the flat screwdriver blade to 'slip' anywhere, so go VERY slow and take your time. If the edges start becoming shinier, that means you're removing the conformal coating.


----------



## ssgwright

The card is bone stock now... the max power it will draw is 300W... idk what the hell is going on.


----------



## SPL Tech

Yall need to stop messing around with this BS shunt mod crap and just flash a real BIOS onto your card. Way safer and better.


----------



## bmgjet

SPL Tech said:


> Yall need to stop messing around with this BS shunt mod crap and just flash a real BIOS onto your card. Way safer and better.


What's this real BIOS then?


----------



## ssgwright

I wish we had full control of the voltage like in the good ol' days... why can't someone mod a damn BIOS for voltage control (and remove the power limit crap)?


----------



## Pedros

Well... I got the worst of all bins then... lolol... anything above +104 on the core and Time Spy will crash. Mem +632, and after that the error correction kicks in...

In Call of Duty: Modern Warfare, for example, any mild OC will give a DirectX error in the game :x


----------



## Nizzen

Pedros said:


> Well... I got the worst of all bins then... lolol... anything above +104 on the core and Time Spy will crash. Mem +632, and after that the error correction kicks in...
> 
> In Call of Duty: Modern Warfare, for example, any mild OC will give a DirectX error in the game :x


+104 doesn't say anything about actual speed.
My 3090 Strix is at about 2100 MHz with +100.


----------



## Tergon123

Nizzen said:


> +104 doesn't say anything about actual speed.
> My 3090 Strix is at about 2100 MHz with +100.


That is true. On the MSI Suprim X, even +75 on the core can get you running 2100 in games like CoD; it's very temperature dependent. Try maxing the fans for benchmarks at least, it makes a big difference. Also, I have found that in order to keep this card in the 60-degree range under load you need the fans at about 70%. That's 2100 rpm or so, which is badly audible, but with good noise-cancelling headphones in game you are good to go.


----------



## mtbiker033

ScorpMCP said:


> Not sure how it is in the States, but I'd check stores that don't have a queue system. I know Amazon usually has some cards you can order a few days before they get them, so I'd check there, and hopefully you can snag one. Founders Edition might be harder to get ^^


thanks!


----------



## man from atlantis

scaling not bad

Result: www.3dmark.com


----------



## SPL Tech

So where are we at with BIOSes? I know people are doing BIOS flashing, so do we have a listing yet of what works for which card?


----------



## manolith

I got a 3080 MSI Ventus 3X OC a couple of days ago and it's awesome. I undervolted to 0.906 V and run 1980 MHz core and +400 mem; temps are 68-70 on the stock fan curve and you can't hear the thing. I'm sure I can push the memory more but haven't had time to play with that.


----------



## ssgwright

and we're back BABY! PR 12,795 http://www.3dmark.com/pr/702365

I broke down and paid a guy to solder all my shunts. Man, going from a shunt mod back to stock I thought I broke my card... it felt like it was gimped... so weak... but now! We're back!


----------



## Falkentyne

ssgwright said:


> and we're back BABY! PR 12,795 http://www.3dmark.com/pr/702365
> 
> I broke down and paid a guy to solder all my shunts. Man, going from a shunt mod back to stock I thought I broke my card... it felt like it was gimped... so weak... but now! We're back!


So now you're pulling more power than stock, finally?


----------



## ssgwright

yes finally... beat my personal best on the second run! PR 12,847 http://www.3dmark.com/pr/702417


----------



## Tergon123

MSI Suprim X 3080. This is with the stock BIOS, overclocked: memory OC = 20403.2, core OC = 2115. Fans were set to 100% in AB.


----------



## Tergon123

Tergon123 said:


> MSI Suprim X 3080. This is with the stock BIOS, overclocked: memory OC = 20403.2, core OC = 2115. Fans were set to 100% in AB.


----------



## ssgwright

Tergon123 said:


> View attachment 2471384


on air too, nice score


----------



## Biggd0gg

Tergon123 said:


> View attachment 2471384












You seem to have maxed out where I currently am with my Suprim as well, https://www.3dmark.com/pr/703028
2115 Core @1.068v, +1400 mem.


----------



## Tergon123

Biggd0gg said:


> View attachment 2471401
> 
> 
> You seem to have maxed out where I currently am with my Suprim as well, https://www.3dmark.com/pr/703028
> 2115 Core @1.068v, +1400 mem.


Your memory is running a lot faster than mine was.


----------



## FeelsBadMan

https://www.3dmark.com/pr/701524 Found some time and tweaked the Vision using the stock 370W vbios, no mods, stock cooler. Not that bad I guess.


----------



## Tergon123

FeelsBadMan said:


> https://www.3dmark.com/pr/701524 Found some time and tweaked the Vision using the stock 370W vbios, no mods, stock cooler. Not that bad I guess.


Very nice score


----------



## r0l4n

SUPRIM X at +156 core, +1207 mem, maxed core voltage and PT sliders, 100% fans.

Port Royal: 12770


----------



## Tergon123

r0l4n said:


> SUPRIM X at +156 core, +1207 mem, maxed core voltage and PT sliders, 100% fans.
> 
> Port Royal: 12770


Was this water cooled? How did you max the core voltage? Anything more than 10% on air crashes my card most of the time.


----------



## r0l4n

Tergon123 said:


> Was this water cooled? How did you max the core voltage? Anything more than 10% on air crashes my card most of the time.


No, air cooled, the card is stock. I use MSI Afterburner (current beta, v4.6.3 Beta 2), and I just slide "Core Voltage %" all the way to 100%.


----------



## Tergon123

r0l4n said:


> No, air cooled, the card is stock. I use MSI Afterburner (current beta, v4.6.3 Beta 2), and I just slide "Core Voltage %" all the way to 100%.


Well, I tried that with the voltage slider and it didn't seem to do much; running the same setup, my card simply will not work on those settings. Core tops out around +145, memory tops out at +800. I also maxed the PL slider and the temperature limit, with fans at 100%, but the numbers I got were really no better overall.


----------



## SirCanealot

acrvr said:


> Posting about my experience with AIO cooling on the 3080 Strix. Also my friend with 3080 FTW3 has the exact same experience. This was using a custom 3d printed aio asetek bracket, since drilling the G12 kraken puts the cold plate off center from the GPU.


Is there anywhere I can order something like this from? I currently have my 280mm AIO attached to my 3080 with cable ties, so it's not the best mount >_< (silent at around 48 degrees is a lot better than the stock cooler though!)
I was considering drilling my G12 bracket and seeing if that works any better, but I don't have access to a good drill right now.


----------



## Saizeo

Anybody got a fix for the XC3 power slider not working? I've got a hybrid and it's still power limited to 330 watts; it should be able to go to 366 with the 8% power max.


----------



## blackzaru

So... I had trouble getting my shunt mod to work on my Asus TUF. I desoldered everything, cleaned the board thoroughly, and soldered new 5 mOhm shunts onto all 6 shunt positions on the card... Then I put the stock cooler (yes, the air cooler) back on, and set 2190 MHz at 1.08 V, 90% TDP and 100% fan speed to see what would happen under a full load:










BOY OH BOY. 2190 MHz achieved, on the first try, without doing anything else, on freaking air!!!! Only ran it for very small intervals so as not to break anything... Now to the next step:










Full waterblock, with liquid metal, and conformal coating.

Let me tell you guys: My excitement is through the freaking roof. Pretty sure I'll be able to push it past 2.2GHz on water!!!


----------



## dev1ance

Biggd0gg said:


> View attachment 2471401
> 
> 
> You seem to have maxed out where I currently am with my Suprim as well, https://www.3dmark.com/pr/703028
> 2115 Core @1.068v, +1400 mem.


Interesting. My Suprim when I had it:
12827: https://www.3dmark.com/pr/613472
12794: https://www.3dmark.com/pr/613457 (notice how the average clock frequency is higher on this run but the overall score is still lower)
12788: https://www.3dmark.com/pr/613446 (same as above, higher average clock frequency but a lower score)

Max on a Gaming X Trio using the Strix BIOS (tried many runs but it just never did better than this and, if anything, regressed permanently; to note, this is also before my upgrade to a 10900K):

I scored 12 617 in Port Royal
Intel Core i7-10700KF Processor, NVIDIA GeForce RTX 3080 x 1, 16384 MB, 64-bit Windows 10
www.3dmark.com


----------



## ssgwright

blackzaru said:


> So... I had trouble getting my shunt mod to work on my Asus TUF. I desoldered everything, cleaned the board thoroughly, and soldered new 5 mOhm shunts onto all 6 shunt positions on the card... Then I put the stock cooler (yes, the air cooler) back on, and set 2190 MHz at 1.08 V, 90% TDP and 100% fan speed to see what would happen under a full load:
> 
> View attachment 2471677
> 
> 
> BOY OH BOY. 2190 MHz achieved, on the first try, without doing anything else, on freaking air!!!! Only ran it for very small intervals so as not to break anything... Now to the next step:
> 
> View attachment 2471682
> 
> 
> Full waterblock, with liquid metal, and conformal coating.
> 
> Let me tell you guys: My excitement is through the freaking roof. Pretty sure I'll be able to push it past 2.2GHz on water!!!


Nice, I use that block with liquid metal as well... the most I can get stable (with shunts modded) is 2175 in Port Royal; in game (Cold War) I run it at 2130.


----------



## xc3_320w

Hi Guys,

Noob here!

XC3 3080 Ultra. I've tried a few different BIOSes, i.e. Palit OC and TUF OC, but I am still running into what seems like the stock power limit (about 320W), as reported by Afterburner...

GPU-Z says the right numbers (340/375 for TUF OC), and I've cranked up the power limit slider in AB...

What's the deal?

The only thing I didn't need to do was use "protect off" when flashing, but it flashed OK...


----------



## SPL Tech

xc3_320w said:


> Hi Guys,
> 
> Noob here!
> 
> XC3 3080 Ultra. I've tried a few different BIOSes, i.e. Palit OC and TUF OC, but I am still running into what seems like the stock power limit (about 320W), as reported by Afterburner...
> 
> GPU-Z says the right numbers (340/375 for TUF OC), and I've cranked up the power limit slider in AB...
> 
> What's the deal?
> 
> The only thing I didn't need to do was use "protect off" when flashing, but it flashed OK...


The XC3 BIOS is a 366W power limit BIOS. The TUF is a 375W. There is no meaningful difference between the two. Also, the TUF has a lower clock rate setting than the XC3 BIOS. You probably would have been better off with the stock BIOS.


----------



## xc3_320w

Well, I barely hit 330W atm, which is what I'm trying to overcome... I figured the XC3 BIOS was somehow ignoring the power limit "boost", hence why I tried a few others.


----------



## ScorpMCP

How do you guys dial in a memory overclock? I started in increments of 100 MHz and saw a gain of about 25 points in Port Royal per 100 MHz until +900; then at +1000 I got about 4 points lower than at +900. Is this the way to do it, or can someone elaborate?


----------



## MikeGR7

Guys, has anyone tried to flash a 3-pin card with a 2-pin BIOS and gotten results??

Someone said the card may glitch and be tricked into drawing more power?

I have the MSI Trio, and although I'm grateful for the 450W Strix BIOS, I think my card can push even more.

I'm watercooled btw.


----------



## obscurehifi

ScorpMCP said:


> How do you guys dial in a memory overclock? I started in increments of 100 MHz and saw a gain of about 25 points in Port Royal per 100 MHz until +900; then at +1000 I got about 4 points lower than at +900. Is this the way to do it, or can someone elaborate?


I'm actually running my own memory testing as I write this. I'm using Fire Strike Ultra because it includes CPU-intensive tests as well as combined tests. If I come up with something interesting, I'll post it. 

One thing I've read is that increasing GPU memory speed too far can decrease CPU performance. So my theory is that if I'm running games that depend on CPU speed, I'll be decreasing FPS. 

Sent from my SM-G973U using Tapatalk


----------



## SirCanealot

xc3_320w said:


> XC3 3080 Ultra. I've tried a few different BIOSes, i.e. Palit OC and TUF OC, but I am still running into what seems like the stock power limit (about 320W), as reported by Afterburner...
> 
> GPU-Z says the right numbers (340/375 for TUF OC), and I've cranked up the power limit slider in AB...
> 
> What's the deal?


I have a Palit Gaming Pro 3080 (320W/350W limit) and I thought I'd try flashing the TUF OC BIOS (340W/375W limit), and I don't think I've seen any change to the power limit either. It seems like discussion on this issue has been ongoing for ages with no resolution >_< 

My card mostly sits around 320W like everyone else's. It's not going to kill me, but my card is under water and I'd love to have another 50-ish watts to play with! 

Suppose I'll just flash back to the original BIOS then... >_<


----------



## Pedros

Basically, when your scores start to decrease, just go back to the last value where you still got gains from adding memory.

I had some weird behavior where between +630 and +750 I had no gains, even losses, but then after +900 I started getting gains again, and now I've stopped at +1365. And these are reproducible results.
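The stepping procedure ScorpMCP and Pedros describe can be written down as a tiny sweep. `apply_mem_offset` and `run_benchmark` here are hypothetical stand-ins for setting the Afterburner slider and reading back a Port Royal graphics score; the point is the search logic, not the plumbing:

```python
def find_best_offset(apply_mem_offset, run_benchmark, step=100, limit=1500):
    """Scan memory offsets and keep the highest-scoring one.

    Deliberately scans the whole range instead of stopping at the first
    regression: GDDR6X error correction can cause a dip (as Pedros saw
    between +630 and +750) with gains returning at higher offsets.
    """
    best_offset, best_score = 0, float("-inf")
    for offset in range(0, limit + step, step):
        apply_mem_offset(offset)       # e.g. set the slider and apply
        score = run_benchmark()        # e.g. run Port Royal, read GPU score
        if score > best_score:
            best_offset, best_score = offset, score
    return best_offset, best_score
```

In practice you would also re-run the winner a couple of times, since run-to-run variance in Port Royal is on the same order as the per-step gains being measured.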


----------



## SPL Tech

xc3_320w said:


> Well, I barely hit 330W atm, which is what I'm trying to overcome... I figured the XC3 BIOS was somehow ignoring the power limit "boost", hence why I tried a few others.


What does GPU-Z report? That's your best resource for figuring out what's going on.


----------



## Aurosonic

ScorpMCP said:


> How do you guys dial in a memory overclock? I started in increments of 100 MHz and saw a gain of about 25 points in Port Royal per 100 MHz until +900; then at +1000 I got about 4 points lower than at +900. Is this the way to do it, or can someone elaborate?


I started with 0 memory:

then went straight to +1000:

then +100 steps till +1500, with a points gain at each step:


----------



## BluePaint

For VRAM I use a paused Heaven benchmark in a window and AB. There you can see FPS changes immediately after applying.


----------



## obscurehifi

Here are my results from testing memory and clock speed overclocking for the day... Whew! Time for a beer...

AORUS 3080 Xtreme Waterforce AIO (stock 370W BIOS) with a Ryzen 3800X on air. The CPU has PBO enabled and set to Level 2.

I applied all the settings with MSI AB, using only the Core Clock and Memory Clock sliders. I did it this way because I have found that if I use an undervolting/overclocking method, it can lead to instability due to the limited voltage. I now believe voltage is the key to stability (not crashing). +150/+1300 was the best I could do without crashing. My GPU core clock doesn't seem to bounce off the rev limiter, so to speak, when hitting the power limit. Perhaps that's due to the low temperatures afforded by the AIO.

Here are the results. I like to use conditional formatting in tables so you can see trends easily without plotting. I ran several settings multiple times and just plotted them as individual points rather than averaging each core clock speed. There are a few points where my processor doesn't seem to like the settings, which can lead to variable results, such as at +130/+500. Blue fill marks low scores for the columns and red fill marks high ones. I added several columns to show actual and average clocks because offsets don't mean anything.

You can definitely see that the GPU overclocking hurts the CPU-heavy tasks (Physics), so this may affect different games and benchmarks differently.

After all that, I decided to do a scatter plot of memory clock speed versus Graphics Score. You can see the trends go upward with both core and memory clocks. Then of course maxing out my GPU and CPU fans increased the scores even more. I'd like to see how the entire graph scales with the 10 degrees lower temperature, but nobody's got time for that! 

Here's the plot from all the runs showing GPU temp in green and GPU fan control in pink, so you can see where I maxed out the fans. The other lines are CPU related. The GPU maximum temps dropped from about 54 to 44 deg C. There are a lot of gains to be had at 44 degrees!









As for the best score, it puts me at #11 on the charts for 3080 and 3800X. Legendary status lol!

https://www.3dmark.com/fs/24506800

I had to see how this worked with Fire Strike Extreme (1440p) and, lo and behold, it puts me at #1 for 3080 and 3800X!

https://www.3dmark.com/3dm/55872430

Cheers

EDIT: Deleted some old thumbnails from the post that were old versions while I was doing the original draft.

EDIT2:

I noticed I averaged the wrong columns for the core clock labels on my scatter plot. Those averages are a little misleading anyway, since I believe 3DMark averages both graphics sessions, and the first one was always power-limited while the second usually hit near the actual clock setting; the first of the two therefore drags the average down. None of the points changed, just two labels. I also changed the x-axis to be the memory clock listed in the results rather than a scaled value I was using to try to get to the actual number, and deleted that column off the chart. Oops!
The other thing I'll add is that the +130/+1300 setting that seems to work really well also gave me my highest Port Royal and Time Spy scores, but instantly crashed in the Fire Strike 1080p test. That one runs fine when I drop it down to +100/+1300. I find the core clock number is the one that causes the most crashes if it's set near the limit. There was something I found that crashed at +100 last week, so I've since dropped down to +50/+1000 (auto fans too) for everyday long gaming sessions for stability.


----------



## SPL Tech

If everyone is getting +1500 on the memory, why the hell didn't Nvidia or the aftermarket companies just clock the mem higher? Seems like a ton of wasted potential, considering memory makes up 50% of the FPS increase from overclocking.
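For scale: bandwidth is just bus width times effective data rate, and judging by Tergon123's readout earlier in the thread (~20,400 MT/s with roughly a +1400 offset), the Afterburner offset appears to add directly to the effective rate. Rough numbers, assuming that mapping holds:

```python
BUS_BITS = 320  # the 3080's memory bus width

def bandwidth_gbps(effective_mtps, bus_bits=BUS_BITS):
    """GB/s = (bus width in bytes) * effective transfers per second."""
    return bus_bits / 8 * effective_mtps / 1000

stock = bandwidth_gbps(19_000)   # 760.0 GB/s, matching the spec sheet
tuned = bandwidth_gbps(20_500)   # 820.0 GB/s with a ~+1500 offset
print(f"+1500 gives about {tuned / stock - 1:.1%} more bandwidth")  # ≈ 7.9%
```

A ~8% bandwidth bump for "free" explains the score gains, but it says nothing about error-correction overhead, which is likely part of why the stable limits vary so much card to card.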


----------



## WillP

Falkentyne said:


> You can't hot glue shunts on each other. Please UNSUBSCRIBE from that piece of crap FRAMECHASERS beginner mentally ill youtube link and NEVER WATCH anything that tool uploads ever again.
> 
> The 'hotglue' method (or liquid electrical tape) was only for securing shunts that had been fastened using conductive silver paint like MG 842AR, but we found that the tape/glue is unnecessary since MG 842AR creates a sufficient bond on its own.
> 
> So you should just solder the shunts properly (remember to use flux!!!) or use MG 824AR paint if you don't want to solder.


I've subscribed to him for the comedy value. The hot glue method didn't even work for him in the follow up video with his 3090. He's hilarious, a brilliant example of vanity, narcissism, and misplaced self-importance.


----------



## xc3_320w

SPL Tech said:


> What does GPU-Z report? That's your best resource for figuring out what's going on.


It reports (under Advanced -> NVIDIA BIOS -> Power Limit):
Default: 340W
Max: 375W

However, my card will not go above 331W (according to HWiNFO and MSI AB) regardless of the Power Limit slider in AB...


----------



## man from atlantis

SPL Tech said:


> If everyone is getting +1500 on the memory, why the hell didn't Nvidia or the aftermarket companies just clock the mem higher? Seems like a ton of wasted potential, considering memory makes up 50% of the FPS increase from overclocking.


Even though my scores increase at +1500 MHz in Port Royal and Time Spy Extreme, I noticed it leads to crashes in games under reasonable fan curves. At 100% fan speed it doesn't crash. I settled at +1000 MHz atm.


----------



## Rawfodog

Aurosonic said:


> It's a lottery for sure. I had 3 cards before: 3080 Strix (max mem was +800), 3080 FTW3 (max mem was +1250), Palit 3080 GameRock OC (max mem +700), and now the Aorus Xtreme Waterforce (mem +1500).
> 
> My chip wasn't able to pass LinX higher than 5200/1.38v, but after scalping it I'm able to pass LinX at 5350/1.42v with over 700 GFLOPS, and that's my daily config. That 5555 needs 1.46v and water temps at about 12-15 Celsius. I'm using it for benches only.
> View attachment 2471043
> 
> And here's my system. I made Cyberpunk 2077 theme colors for that time
> View attachment 2471044
> View attachment 2471045







I have both the air-cooled Aorus Xtreme and the Waterforce WB; I haven't had time to mess with them because I've had to work a lot lately. I wanted to ask if you think it's worth it to shunt mod the Waterforce for anything other than benchmark scores, as I could just go with 450W out of the box on the air-cooled one.


----------



## ssgwright

OK guys, don't buy the EK QuantumX Delta TEC water block.

It works great for a modest overclock, but anything serious, say a 10850K at 5.0 trying to run Cinebench, will overload the TEC... damn, had such high hopes for this thing. Gaming, it will run 5.1 keeping temps at 50c (and 20c with no load), but if you run a benchmark or stress test it proves too much for the Peltier.


----------



## Nizzen

ssgwright said:


> OK guys, don't buy the EK QuantumX Delta TEC water block.
> 
> It works great for a modest overclock, but anything serious, say a 10850K at 5.0 trying to run Cinebench, will overload the TEC... damn, had such high hopes for this thing. Gaming, it will run 5.1 keeping temps at 50c (and 20c with no load), but if you run a benchmark or stress test it proves too much for the Peltier.


This is not news 

Running a direct-die-cooled 10900K, and temps are under 50c in most games


----------



## Aurosonic

Rawfodog said:


> I have both the air-cooled Aorus Xtreme and the Waterforce WB; I haven't had time to mess with them because I've had to work a lot lately. I wanted to ask if you think it's worth it to shunt mod the Waterforce for anything other than benchmark scores, as I could just go with 450W out of the box on the air-cooled one.


It's definitely worth it. Without the shunt I wasn't able to play Cyberpunk even at 2100/0.950, because it drops frequencies to 2025-2010 due to the 350W limit. After the shunt I'm able to play Cyberpunk at a steady 2175/1.068 without any GPU frequency drops. Same for other games. Metro Exodus with RTX on and DLSS off is the most demanding game I've met so far; it easily eats 350W even at 1935/0.862v. After the shunt I was able to run it at 2100/1.006v without frequency drops.
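Aurosonic's numbers make the throttling easy to see with the same P ~ f·V² rule of thumb, scaled from the one measured point (Metro eating ~350 W at 1935 MHz / 0.862 V). This is a crude overestimate, since static and memory power don't scale this way, but it shows why those profiles are hopeless under a 350 W cap:

```python
def scaled_power(freq_mhz, volts, ref=(1935, 0.862, 350)):
    """Very rough power estimate scaled from a measured (MHz, V, W) point."""
    f0, v0, p0 = ref
    return p0 * (freq_mhz / f0) * (volts / v0) ** 2

print(round(scaled_power(2100, 1.006)))  # ~517 W estimated for the gaming profile
print(round(scaled_power(2175, 1.068)))  # ~604 W for the benching profile
```

Both land far past 350 W even if the estimate is generous, so without the shunt mod the limiter has no choice but to pull clocks.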


----------



## MikeGR7

ssgwright said:


> OK guys, don't buy the EK QuantumX Delta TEC water block.
> 
> It works great for a modest overclock, but anything serious, say a 10850K at 5.0 trying to run Cinebench, will overload the TEC... damn, had such high hopes for this thing. Gaming, it will run 5.1 keeping temps at 50c (and 20c with no load), but if you run a benchmark or stress test it proves too much for the Peltier.


Something is not right with your setup man, it has been demonstrated to hit 5.6 under load no problem.
What is your watercooling setup for managing the TEC??


----------



## MikeGR7

Nizzen said:


> This is not news
> 
> Running a direct-die-cooled 10900K, and temps are under 50c in most games


I am interested in your setup 😋 

Can you please give me more details how to direct die cool a 10900K??

Thank you


----------



## MikeGR7

MikeGR7 said:


> Guys, has anyone tried to flash a 3-pin card with a 2-pin BIOS and gotten results??
> 
> Someone said the card may glitch and be tricked into drawing more power?
> 
> I have the MSI Trio, and although I'm grateful for the 450W Strix BIOS, I think my card can push even more.
> 
> I'm watercooled btw.


Soooo anyone???


----------



## ssgwright

MikeGR7 said:


> Something is not right with your setup man, it has been demonstrated to hit 5.6 under load no problem.
> What is your watercooling setup for managing the TEC??


It was not demonstrated. I watched d8euar or whatever his name is from Germany, and he could run only 2 cores at 5.8, but he never really showed any benchmarks or stress tests... well, he did, but only briefly. I can run Cinebench at 5.2 but the temps slowly creep up; then, right towards the end of the first run, it'll reach over 100c. The heat just overwhelms the Peltier... I have a 480 and a 360 rad in my loop. Before, with my old block, I was stable in Cinebench at 5.2 with AVX offset at 0 and temps barely hit 85c.


----------



## acoustic

ssgwright said:


> It was not demonstrated. I watched d8euar or whatever his name is from Germany, and he could run only 2 cores at 5.8, but he never really showed any benchmarks or stress tests... well, he did, but only briefly. I can run Cinebench at 5.2 but the temps slowly creep up; then, right towards the end of the first run, it'll reach over 100c. The heat just overwhelms the Peltier... I have a 480 and a 360 rad in my loop. Before, with my old block, I was stable in Cinebench at 5.2 with AVX offset at 0 and temps barely hit 85c.


der8auer does do a video with the TEC and shows it struggles at high power draws on a Ryzen. He was seeing instability with Cinebench and other workloads (especially AVX), though it did keep things cool during gaming. It sounds like a great idea, but considering the power draw of the unit itself, I'm not very impressed.


----------



## MikeGR7

acoustic said:


> der8auer does do a video with the TEC and shows it struggles at high power draws on a Ryzen. He was seeing instability with Cinebench and other workloads (especially AVX), though it did keep things cool during gaming. It sounds like a great idea, but considering the power draw of the unit itself, I'm not very impressed.


Well, it is not designed for Ryzen, so a meh result there seems understandable.
A meh result on Intel, though, is much harder to excuse given the 350+ euro price...



ssgwright said:


> it was not demonstrated. I watched der8auer (or whatever his name is) from Germany and he could run only 2 cores at 5.8, but he never really showed any benchmarks or stress tests... well he did, but only briefly. I can run Cinebench at 5.2 but the temps slowly creep up, then right towards the end of the first run it'll reach over 100c. The heat just overwhelms the peltier... I have a 480 and a 360 rad in my loop. Before, I was stable in Cinebench at 5.2 with AVX offset at 0, and temps barely hit 85c with my old block.


I have to agree with you this time. I rewatched his video and the guy literally skipped the multi-threaded run of Cinebench...
Smart move on his part: first run the single-threaded test showing all cores at 5.6 and temps in the 40s, then quickly skip past the heavier test, leaving us with the impression that it performed similarly...
I am genuinely disappointed because I was interested in this product.
Thank you for sharing your results.


----------



## ssgwright

acoustic said:


> D8rbauer does do a video with the TEK and shows it struggles at high power-draws on a Ryzen. He was seeing instability with Cinebench and other workloads (especially AVX), thought it did keep things cool during gaming. It sounds like a great idea, but considering the power-draw of the unit itself, I'm not very impressed.


ya it doesn't even compare to my block at high workloads, but you can game at crazy fast MHz... I guess if you set it up right to boost to a certain clock in games and a lower clock for heavy workloads, it could be worth it for the extra fps. It was fun to play with, gonna sell for $300 if anyone wants it *update: it sold*


----------



## SPL Tech

ssgwright said:


> ya it doesn't even compare to my block at high workloads, but you can game at crazy fast MHz... I guess if you set it up right to boost to a certain clock in games and a lower clock for heavy workloads, it could be worth it for the extra fps. It was fun to play with, gonna sell for $300 if anyone wants it *update: it sold*


Hey, you take your dirty CPU modding mouth out of here with that. This is for GPUs, not CPUs. CPUs are lame. GPUs are what the people want.


----------



## CantingSoup

Anyone have experience with the ELSA 3080 ERAZOR X Cold War Edition? Is it a reference card?


----------



## ssgwright

CantingSoup said:


> Anyone have experience with the ELSA 3080 ERAZOR X Cold War Edition? Is it a reference card?


never even heard of that card


----------



## CantingSoup

ssgwright said:


> never even heard of that card


It’s a Japanese brand: Amazon.co.jp: ELSA GeForce RTX 3080 ERAZOR X Graphics Board GD3080-10GEREZX VD7432: Computers & Peripherals


----------



## ssgwright

CantingSoup said:


> It’s a Japanese brand: Amazon.co.jp: ELSA GeForce RTX 3080 ERAZOR X Graphics Board GD3080-10GEREZX VD7432: Computers & Peripherals


looks like a 3 pin card? if it is you should flash the bios to strix or ftw3


----------



## CantingSoup

ssgwright said:


> looks like a 3 pin card? if it is you should flash the bios to strix or ftw3


It’s a 2 pin. I haven’t actually bought the card yet. I’m trying to see if anyone has a picture of the PCB.


----------



## monitorhero

Hey guys,

I have a question. In MSI Afterburner you have a frequency curve. Is it different on every card, or should the max clocks be the same across one model?
I have an ASUS TUF 3080 OC and my frequency curve maxes out at 1950 MHz. Another guy's frequency curve goes to 2010 MHz with the same BIOS and same card. Is this an indication of whether you got a good or bad chip?


----------



## ZAlien

Hi guys. What would be yours best 3080 recommendation? I'm currently aiming for Strix OC (waiting for it to show up at stores) mainly because of quality of the components used on that card. But is that the best choice, is there something better I'm missing?


----------



## ssgwright

monitorhero said:


> Hey guys,
> 
> I have a question. In MSI Afterburner you have a frequency curve. Is it different on every card, or should the max clocks be the same across one model?
> I have an ASUS TUF 3080 OC and my frequency curve maxes out at 1950 MHz. Another guy's frequency curve goes to 2010 MHz with the same BIOS and same card. Is this an indication of whether you got a good or bad chip?


no, that has nothing to do with it... if they are the same card they both should have the same curve. If you add to the core clock, the whole curve goes up across the board. What really matters is what clock you can get stable while gaming
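ssgwright's point about the offset shifting the whole curve can be sketched in a few lines (the voltage/frequency points below are made-up illustrative values, not read from any real card):

```python
# Sketch of how a flat core-clock offset moves an Afterburner-style V/F curve.
# The voltage points and stock frequencies are illustrative only.

stock_curve = {  # mV -> MHz
    800: 1710,
    900: 1860,
    1000: 1950,
    1081: 2010,
}

def apply_offset(curve, offset_mhz):
    """Shift every point of the curve by the same offset, as a +core slider does."""
    return {mv: mhz + offset_mhz for mv, mhz in curve.items()}

oc_curve = apply_offset(stock_curve, 75)
print(oc_curve[1000])  # 2025
```

The per-point editing in Afterburner (dragging individual points) is the exception; the plain slider moves everything uniformly, which is why two identical cards should show identical stock curves at the same temperature.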


----------



## monitorhero

ssgwright said:


> no, that has nothing to do with it... if they are the same card they both should have the same curve. If you add to the core clock, the whole curve goes up across the board. What really matters is what clock you can get stable while gaming


Both cards run at stock settings. It is really weird. I also flashed from BIOS 94.02.26.48.40 to x.x.x.x.65 but the card got hotter. So I used my saved file and flashed it back to the original BIOS.
But now I get two different readings in GPU-Z

Original Bios


After flashing back to the same BIOS version. Notice how Pixel Fillrate, Texture Fillrate and GPU Clock changed to lower numbers? What is going on here?


----------



## Falkentyne

monitorhero said:


> Both cards run at stock settings. It is really weird. I also flashed from BIOS 94.02.26.48.40 to x.x.x.x.65 but the card got hotter. So I used my saved file and flashed it back to the original BIOS.
> But now I get two different readings in GPU-Z
> 
> Original Bios
> View attachment 2472212
> 
> After Flashback to the same Bios Version. Notice how Pixel Fillrate and Texture Fillrate and GPU Clock changed to lower numbers? What is going on here?
> View attachment 2472213


You were overclocking when you took the screenshot on your original bios.
That's called an "Oops" moment.

Also the reason you ran hotter on the new vbios is because Asus increased some internal power limits for you. That's a good thing and you should probably use that vbios for more performance. A few C hotter isn't going to destroy your video card.


----------



## monitorhero

Ok, so at 100% it now has more power going through the card? But when I had GPU-Z open and took the screenshot everything was at stock. But ok, good to know.
Since it is a dual BIOS card and the Asus .exe for V2 is just one file: if I am in the OC BIOS will it flash the OC BIOS, and if I am in the Silent BIOS will it flash the new Silent BIOS? Or is there just one BIOS for OC mode?


----------



## Falkentyne

monitorhero said:


> Ok, so at 100% it now has more power going through the card? But when I had GPU-Z open and took the screenshot everything was at stock. But ok, good to know.
> Since it is a dual BIOS card and the Asus .exe for V2 is just one file: if I am in the OC BIOS will it flash the OC BIOS, and if I am in the Silent BIOS will it flash the new Silent BIOS? Or is there just one BIOS for OC mode?


You can turn off the computer, switch the BIOS switch to the other position (the one you want the new BIOS on), power on, and flash the new BIOS to that slot. Then, if you want to use the original BIOS, just power off and flip the switch back. It's really not hard.


----------



## Colonel_Klinck

ssgwright said:


> oh wow there is a new TUF bios out


Can I ask what BIOS version was installed after you used the ASUS RTX3080_V2.exe updater? If that is how you updated.

Mine is now 94.02.42.40.66, which I can't find on the techpowerup BIOS page



----------



## ssgwright

when I posted that was when the v2 BIOS you already have came out


----------



## SPL Tech

ZAlien said:


> Hi guys. What would be your best 3080 recommendation? I'm currently aiming for the Strix OC (waiting for it to show up in stores), mainly because of the quality of the components used on that card. But is that the best choice, or is there something better I'm missing?


Whatever you can get. Cards are sold out everywhere. You're going to be lucky to get any card anywhere other than eBay at scalper prices.


----------



## monitorhero

Falkentyne said:


> You can turn off the computer, switch the BIOS switch to the other position (the one you want the new BIOS on), power on, and flash the new BIOS to that slot. Then, if you want to use the original BIOS, just power off and flip the switch back. It's really not hard.


I understand the process. I just wondered if that .exe has two different BIOSes included (silent and OC). If I run that exe on both the silent and OC BIOS, will I have the same version on both, or something like 94.02.42.40.66 and 94.02.42.40.65 for either one?

Also here is my frequency curve on the TUF 3080 OC


And here is the other guys TUF 3080 OC Frequency Curve:


You see he did not move any points up. It's his stock curve while mine is much lower.


----------



## bmgjet

Asus has a few BIOSes packed into their updater. It looks at the current version against the embedded list, then selects the next version up from that. So it will flash a different BIOS per switch position.
EVGA does an I2C check to see what the switch position is, then selects theirs based on that.

You can open the Asus updater in WinRAR and see all its contents.
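If you'd rather not eyeball the extracted archive by hand, one rough way to locate candidate VBIOS images is to scan for the PCI option-ROM signature bytes 0x55 0xAA. This is a sketch only; it assumes the images are stored uncompressed, which may not hold for every packer, and a hit is only a candidate (the two bytes can occur by chance):

```python
# Scan a binary blob for the PCI expansion-ROM signature 0x55 0xAA,
# which marks the start of a VBIOS image. Purely illustrative.

def find_rom_offsets(blob: bytes):
    """Return every offset where the 0x55AA signature bytes appear."""
    offsets = []
    start = 0
    while True:
        i = blob.find(b"\x55\xaa", start)
        if i == -1:
            return offsets
        offsets.append(i)
        start = i + 2

# Fake data standing in for an extracted updater payload:
data = b"junk" + b"\x55\xaa" + b"rom1" + b"pad" + b"\x55\xaa" + b"rom2"
print(find_rom_offsets(data))  # [4, 13]
```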


----------



## ssgwright

wow, there are 11 different BIOSes in the asus update... interesting


----------



## bmgjet

ssgwright said:


> wow, there's 11 different bios' in the asus update... interesting


Look in the txt file and you can work out what ones upgrading to what.


----------



## Nizzen

ssgwright said:


> wow, there's 11 different bios' in the asus update... interesting


There are a few Asus models


----------



## monitorhero

bmgjet said:


> Asus has a few BIOSes packed into their updater. It looks at the current version against the embedded list, then selects the next version up from that. So it will flash a different BIOS per switch position.
> EVGA does an I2C check to see what the switch position is, then selects theirs based on that.
> 
> You can open the Asus updater in WinRAR and see all its contents.


perfect. Thank you for explaining that. This was the answer I was looking for.


----------



## Hresna

Got a ROG STRIX 3080 about a week ago. Thought I would share a recent experience involving the fans failing, allowing the card to hit 83 degrees at idle...

I’ve been running the card in the quiet BIOS for 0dB fan support and testing undervolt curves in Afterburner for a few days. Normally when I’m done testing for a while I’ll let the fans sit at 30%, or close Afterburner and let the card’s BIOS take over.

Last night I closed Afterburner and my wife and I settled in to watch a movie (thankfully, in the same room as the PC). About half an hour into the movie, we were jarred to hear the fans go full leaf-blower for a few seconds. I ran over to the PC to find the card had idled its way all the way to its safety cutoff of 83 degrees.

I see there’s a new firmware from Asus dated 12 Dec that is supposed to improve “0dB compatibility”, so I will flash that today; hopefully it fixes whatever bug/glitch I encountered.
I’m glad the card had a safety in place to stop it idling into dangerous temps, but I'm still not thrilled that the card hit 83 degrees without my knowing!


----------



## Hresna

monitorhero said:


> I understand the process. I just wondered if that .exe has two different BIOSes included (silent and OC). If I run that exe on both the silent and OC BIOS, will I have the same version on both, or something like 94.02.42.40.66 and 94.02.42.40.65 for either one?


Also, forgive me if you already know this, but Afterburner's "reset" curve is temperature dependent. If you hit reset at 50C or more, you get a different curve than if resetting an ambient-temp GPU. Best practice seems to be to run the fans long enough to lower temps first, then reset to get the "real" stock setting.


----------



## BIaze

Does anyone here have the BIOS file for the Colorful iGame RTX 3080 Vulcan X OC?

Not the Vulcan OC (non-X) that's on techpowerup. This one has a 1905 boost and I'm curious to see what power limit it has


----------



## asdkj1740

wtffff gigabyte aorus master rev2.0 with 3*8pin.......................


----------



## monitorhero

Hresna said:


> Also, forgive me if you already know this, but Afterburner's "reset" curve is temperature dependent. If you hit reset at 50C or more, you get a different curve than if resetting an ambient-temp GPU. Best practice seems to be to run the fans long enough to lower temps first, then reset to get the "real" stock setting.


Oh ok. Interesting. My card however idles at 43°C. That's probably why the curve is different on my computer than the other guy I talked to. His idled at 29°C.


----------



## Colonel_Klinck

ssgwright said:


> when I posted that is when the v2 bios you already have came out


No my bios before was 94.02.26.80.EF. I was just curious if the bios you updated to was the same as loaded into mine now.

I haven't seen any increase in power. Still tops out at max 158w in GPU-Z.

edit: just seen the post about opening the ASUS exe file.


----------



## ssgwright

new Port Royal personal best: 12,865 http://www.3dmark.com/pr/731817

man, I wish we had full control over the voltage of these cards


----------



## monitorhero

Is your card watercooled? @ssgwright


----------



## Muqeshem

Hello. I have an RTX 3080 EVGA FTW3 Ultra and it consumes a lot of wattage. I flashed the OC BIOS with the 450W BIOS, but seriously, why do I need the 450W BIOS to do 2100 MHz when someone with a Palit does it with 350W?! Any suggestions, FTW3 owners?


----------



## Muqeshem

Also, what is the minimum expected graphics score in Time Spy (not Extreme) for the EVGA FTW3 Ultra model?


----------



## man1ac

So I could get my hands on a 3080 TUF OC - is there a way to get more power limit? I want to water it and could use more than 375W.
I read the Gigabyte BIOS works?


----------



## xc3_320w

Muqeshem said:


> Hello. I have an RTX 3080 EVGA FTW3 Ultra and it consumes a lot of wattage. I flashed the OC BIOS with the 450W BIOS, but seriously, why do I need the 450W BIOS to do 2100 MHz when someone with a Palit does it with 350W?! Any suggestions, FTW3 owners?



Troll?
Don't run the XOC BIOS?
Lower your core voltage & clocks?
Get a S3 Trio?


----------



## man1ac

Thanks. Actually no, and I just have minutes to decide if I want the card. Hadn't had the time to get proper information.


----------



## bmgjet

Muqeshem said:


> Hello. I have an RTX 3080 EVGA FTW3 Ultra and it consumes a lot of wattage. I flashed the OC BIOS with the 450W BIOS, but seriously, why do I need the 450W BIOS to do 2100 MHz when someone with a Palit does it with 350W?! Any suggestions, FTW3 owners?


They got more lucky on the silicon lottery and got a lower leakage chip.


----------



## xc3_320w

man1ac said:


> Thanks. Actually no and just have minutes to decide of I want the card. Hadnt had the time to get proper Information.


Sorry, I missed quoting the reply properly... Dangers of walking and replying..

AFAIK the TUF OC won't go over the power limit unless you shunt mod it...

In saying that, 375W is enough unless you're trying to hit records... in which case you probably won't care about modding it...


----------



## Muqeshem

xc3_320w said:


> Troll?
> Don't run the XOC BIOS?
> Lower your core voltage & clocks?
> Get a S3 Trio?


Excuse me? What is wrong with you?


----------



## Muqeshem

bmgjet said:


> They got more lucky on the silicon lottery and got a lower leakage chip.


I do think I can lower the wattage and keep a high frequency, because the card runs warm and it is very annoying.


----------



## xc3_320w

Muqeshem said:


> Excuse me ? What is wrong with you ?


you're complaining about power usage after flashing the XOC BIOS?

try undervolting...


----------



## Muqeshem

xc3_320w said:


> you're complaining about power usage after flashing the XOC BIOS?
> 
> try undervolting...


Wait. I have the RTX 3080 FTW3 Ultra, not the XOC. I flashed my card with the 450W BIOS, but even the normal BIOS (the FTW3 has two BIOSes) draws 400W during games!


----------



## xc3_320w

the 450w BIOS IS the XOC BIOS (for the eVGA 3080 FTW3 Ultra)

regardless, the power limit is just that, an upper limit... i.e. what the card will try to stay below (if left unchecked).

if you're not happy about wattage (or even if you are, but want a more efficient card), then undervolt... you don't need to use all 450 watts...
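As a rough first-order sanity check of why undervolting helps: dynamic power scales roughly with V² × f. The voltage and clock figures below are example numbers from this thread, not measurements from any particular card:

```python
# First-order estimate of relative GPU dynamic power, P ~ C * V^2 * f.
# Leakage and memory power are ignored, so treat the result as a rough guide.

def relative_power(v_new, f_new, v_old, f_old):
    """Ratio of dynamic power at (v_new, f_new) versus (v_old, f_old)."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

# e.g. an undervolt of 0.875 V @ 1995 MHz versus a stock-ish 1.081 V @ 1905 MHz
r = relative_power(0.875, 1995, 1.081, 1905)
print(f"{r:.2f}")  # ~0.69
```

So an undervolt in that range can plausibly shave around a third off dynamic power while holding (or even gaining) clocks, which is why it is the usual first answer to "my card runs warm".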


----------



## Muqeshem

xc3_320w said:


> the 450w BIOS IS the XOC BIOS (for the eVGA 3080 FTW3 Ultra)
> 
> regardless, the power limit is just that, an upper limit... i.e. what the card will try to stay below (if left unchecked).
> 
> if you're not happy about wattage (or even if you are, but want a more efficient card), then undervolt... you don't need to use all 450 watts...


I did undervolt. I ran the card at 1018 mV at 2115 MHz but the card is running warm because it is using 450W. I just do not know what to do. Liquid cooling is an option.


----------



## SanderH

Feels a bit daunting to even post in this thread as most people here are pretty hardcore, but here goes nothing.

My buddy recently received his RTX 3080 (Gigabyte Vision OC) and feels like the card somehow isn't running as well as expected, especially compared to his 2070 Super. Only modest FPS gains in both 1080p & 1440p. For reference, Time Spy score = 15,692 (GPU 17,525 / CPU 9,853).

Could any of the components be causing a bottleneck?


AMD Ryzen 7 3700X
MSI X470 Gaming Plus
32GB Corsair Vengeance LPX CMK32GX4M2D3000C16
Corsair RM750x 
Intel 660p 512GB

I'm starting to get a bit worried as I'm expecting to receive the exact same card very soon and my setup is pretty similar:

AMD Ryzen 7 3800X
ASUS ROG Strix B450-F Gaming
32GB Corsair Vengeance RGB Pro CMW32GX4M2Z3200C16
MSI MPG A750GF
Samsung 970 EVO 1TB (data)
Samsung 970 Evo Plus 250GB (OS)


----------



## xc3_320w

SanderH said:


> For reference, Time Spy score = 15,692 (GPU 17,525 / CPU 9,853).


Well my stock XC3 Ultra posted a 16950 GPU score in Time Spy... under water and undervolted (0.875ish @ 1995), AND running the Palit BIOS, I get an 18450 GPU score or thereabouts...

So I would say your friend's score is about right...

Define modest FPS gains?

edit: I should say, I don't know if this is "good" or not 

Edit 2: I went back to a 1080 Ti (FTW) result, and it scored a 10260 graphics score... So yeah, the 3080 is nearly 2x as fast... The 1080 Ti should be very similar to a 2070S... (excluding RTX features).


----------



## josephimports

xc3_320w said:


> Well my stock XC3 Ultra posted a 16950 GPU score in TimeSpy... under water and undervolted (0.875ish @ 1995), AND running the Palit BIOS, I get 18450 GPU score or thereabouts...
> 
> So I would say your friends score is about right...
> 
> Define modest FPS gains?
> 
> edit: I should say, I dont know if this is "good" or not


How does the palit bios compare to the tuf oc?


----------



## xc3_320w

Not much in it.. but both are leagues ahead of the stock XC3 Ultra (or hybrid) BIOS...

I still don't understand why the power slider is doing nothing, nor why supposedly 340W BIOSes are still limited to 320-ish... Is there some other power limit in hardware?

Pretty annoying, I could use another couple of hundred MHz (on water), but the volts required just push the power limit way over...

I don't really wanna have to shunt mod the card...


----------



## Arni90

xc3_320w said:


> Not much in it.. but both are leagues ahead of the stock XC3 Ultra (or hybrid) BIOS...
> 
> I still don't understand why the power slider is doing nothing, nor why supposedly 340W BIOSes are still limited to 320-ish... Is there some other power limit in hardware?


I _think_ the cards have some phases hardwired to the PCIe-slot power, and the card is very close to the power draw limit at stock configuration, GPU-Z reported 66W power draw from the PCIe-slot pretty much constantly before I shunt-modded my card's PCIe-slot power limit. If this is true, then the only solution is to shunt-mod the PCIe shunt.
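For anyone weighing the shunt mod Arni90 mentions: the controller infers rail current from the voltage drop across a known shunt resistance, so soldering a second resistor in parallel lowers the effective resistance and scales down the *reported* current and power by the same ratio. A sketch (5 mΩ is a commonly cited stock shunt value, but check your own card's PCB before assuming it):

```python
# Why a parallel shunt changes reported power: the controller computes
# I = V_drop / R_shunt. Halving the effective R halves what it reports,
# while the actual draw is unchanged. Values are illustrative only.

def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

R_STOCK = 0.005                      # ohms (5 mOhm, assumed stock shunt)
r_eff = parallel(R_STOCK, 0.005)     # stack an equal-value resistor on top

actual_slot_watts = 100.0            # what the rail really draws
reported = actual_slot_watts * r_eff / R_STOCK
print(reported)  # 50.0 -> the limiter now allows ~2x the real power
```

That is also why a shunt-modded card's software power readings become meaningless: every sensor downstream of the modded shunt reads low by the same factor.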


----------



## obscurehifi

SanderH said:


> Feels a bit daunting to even post in this thread as most people here are pretty hardcore , but here goes nothing.
> 
> My buddy recently received his RTX 3080 (Gigabyte Vision OC) and feels like the card somehow isn't running as well as expected, especially compared to his 2070 Super. Only modest FPS gains in both 1080p & 1440p. For reference, Time Spy score = 15.692 (GPU 17.525 / CPU 9.853).
> 
> Could any of the components be causing a bottleneck?
> 
> 
> AMD Ryzen 7 3700X
> MSI X470 Gaming Plus
> 32GB Corsair Vengeance LPX CMK32GX4M2D3000C16
> Corsair RM750x
> Intel 660p 512GB
> 
> I'm starting to get a bit worried as I'm expecting to receive the exact same card very soon and my setup is pretty similar:
> 
> AMD Ryzen 7 3800X
> ASUS ROG Strix B450-F Gaming
> 32GB Corsair Vengeance RGB Pro CMW32GX4M2Z3200C16
> MSI MPG A750GF
> Samsung 970 EVO 1TB (data)
> Samsung 970 Evo Plus 250GB (OS)


These are great questions! Ones I've asked myself, wondering if my 3800X purchase was the wrong one. It turned out I had a 3900XT in my cart for several days in October and decided not to buy; when I finally did, they were scarce, so I had to settle for the 3800X for my new build.

Anyway, I pulled some information from the 3DMark results browser on Port Royal scores and divided out the top 100 scores for each 3000- and 5000-series Ryzen processor (well, just the X models to keep it simpler). Here's what I found after having a little fun with pivot charts:


The 5800X has the top spot but one of the widest ranges of scores. The 5900X takes the second spot but has all of its top 100 scores very high.


Overall, the lowest 3080 score here is 12,349, by a 3950X. That's still pretty good! My highest 3800X 3080 score on PR is 12,262: https://www.3dmark.com/3dm/56010924. (I think a lot of these scores are probably 3080s running at 450W on water.) So I don't even make the top 100 for any of the processors, but making the top 100 depends mostly on how hard the 3080 is being pushed.


Speaking of overclocking, here's a breakdown of GPU clock by processor, in which you can see a clear upward trend with GPU clock setting within each processor model. An exception is the 3600, which seems to take a nosedive at higher GPU clock settings, but not having to max out the GPU clock seems like a decent thing.


Then just overall: total high score by clock speed. I believe these are just the settings, not the average clock speed during the run.


Then lastly, this is a bit of an eyesore, but the memory clock setting is graphed within each GPU clock setting with all the CPUs bundled together. I've noticed that a lot of the top scores are running about +3000 on the mem clock, putting them around 22,000 MHz effective. There's about a 16x multiplier from the actual clock to the reported 3DMark clock, so that translates to about 1375 as 3DMark reports it. Running GPU clock settings above 2100 MHz puts all of these 3080s into high-end territory, as in they complete the run while heavily overclocked. Unfortunately, this doesn't show the average clock speed or power level obtained during the run.
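The clock arithmetic above can be checked in a quick sketch (the 19000 MHz effective stock figure is from the spec table at the top of the thread):

```python
# Quick check of the GDDR6X memory-clock arithmetic: stock 19000 MHz effective,
# a +3000 offset (in effective MHz) lands at 22000, and dividing by the 16x
# multiplier gives the base clock that 3DMark reports.

STOCK_EFFECTIVE = 19000  # MHz, RTX 3080 stock effective data rate
offset = 3000            # the typical top-score offset mentioned above

effective = STOCK_EFFECTIVE + offset
reported = effective / 16
print(effective, reported)  # 22000 1375.0
```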


After seeing all of this, I'd probably buy the 3600X, although the top 100 3600X scores are probably overclocking the CPU too.

Keep in mind these are just the top 100 scores, so the numbers are skewed a bit by what 'top 100' systems do to their setups, which is probably outside the realm of normal users; any score in the 12,000s is probably considered great.

Cheers


----------



## obscurehifi

SanderH said:


> Feels a bit daunting to even post in this thread as most people here are pretty hardcore , but here goes nothing.
> 
> My buddy recently received his RTX 3080 (Gigabyte Vision OC) and feels like the card somehow isn't running as well as expected, especially compared to his 2070 Super. Only modest FPS gains in both 1080p & 1440p. For reference, Time Spy score = 15.692 (GPU 17.525 / CPU 9.853).
> 
> Could any of the components be causing a bottleneck?
> 
> 
> AMD Ryzen 7 3700X
> MSI X470 Gaming Plus
> 32GB Corsair Vengeance LPX CMK32GX4M2D3000C16
> Corsair RM750x
> Intel 660p 512GB
> 
> I'm starting to get a bit worried as I'm expecting to receive the exact same card very soon and my setup is pretty similar:
> 
> AMD Ryzen 7 3800X
> ASUS ROG Strix B450-F Gaming
> 32GB Corsair Vengeance RGB Pro CMW32GX4M2Z3200C16
> MSI MPG A750GF
> Samsung 970 EVO 1TB (data)
> Samsung 970 Evo Plus 250GB (OS)


The other part of the picture in determining whether the CPU matters that much is what resolution you are both gaming or benchmarking at. At 1080p the CPU matters a lot more, while at 4K the GPU matters more; lower resolutions are the biggest thing that brings out CPU bottlenecks.

For Ryzens, memory speed and timings matter a lot. I've spent the last few days playing with my memory timings and finally settled on 3733 CL16 as what maximizes my scores and makes my 3800X run a bit more 'powerful.'
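A handy way to compare configs like the 3733 CL16 mentioned above is absolute first-word latency: CAS cycles divided by the actual clock (half the MT/s rate). A quick sketch:

```python
# Absolute CAS latency in nanoseconds: the real clock is half the MT/s
# transfer rate (DDR), so latency_ns = CAS / (MT/s / 2) * 1000.

def latency_ns(mt_per_s, cas):
    return cas / (mt_per_s / 2) * 1000  # ns

print(round(latency_ns(3733, 16), 2))  # 8.57  (the 3733 CL16 config above)
print(round(latency_ns(3000, 16), 2))  # 10.67 (e.g. a 3000 CL16 kit)
```

Lower absolute latency is what tends to help Ryzen in CPU-bound games, alongside keeping the memory clock in 1:1 sync with the Infinity Fabric.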


----------



## obscurehifi

Here's a look at the top 100 scores for Fire Strike Ultra with Ryzen 3000s and 5000s. This plays out differently than the Port Royal scores. In this one, the 3600X falls behind the pack, along with the 3700X and 3800X. The 3900X and 3950X play nicely with the 5000s.


This one divides out the sub-scores, so you can better see what's contributing. Last I looked, Graphics is 75% of the score while the other two have weightings of only 10 and 15%. There's a clear case that the CPU matters here, even at Ultra settings.


Interestingly, the 3000 processors have a lower overall score than their Graphics score, showing that the reduced Combined and Physics scores are holding them back.
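That pattern falls out of how 3DMark combines sub-scores. Assuming the weighted-harmonic-mean form its technical guides describe, and using the Fire Strike weights quoted above (Graphics 75%, Physics 15%, Combined 10%), a weak Physics or Combined result drags the overall below the Graphics score. The sub-scores below are made-up illustrative numbers:

```python
# Sketch of a weighted harmonic mean of Fire Strike sub-scores, using the
# 75/15/10 weights quoted above. The form and exact weights should be checked
# against 3DMark's own technical guide; sub-scores here are invented.

def fire_strike_overall(graphics, physics, combined):
    w_g, w_p, w_c = 0.75, 0.15, 0.10
    return 1.0 / (w_g / graphics + w_p / physics + w_c / combined)

overall = fire_strike_overall(graphics=22000, physics=24000, combined=10000)
print(round(overall))  # the weak Combined term pulls this well under 22000
```

Because it is a harmonic mean, the weakest sub-score punches above its nominal weight, which is exactly the "Graphics higher than Overall" effect visible in the 3000-series results.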


Here the Physics scores more than double, so even though Physics is only 15% of the overall score, it has a big effect.


Lastly, you can see clearly here that gpu clock speed matters, regardless of CPU.


Cheers


----------



## obscurehifi

Ok, here's the last I'm planning on doing. I figured I couldn't make all these charts without including Time Spy!

Time Spy appears to favor the Ryzen models in order: 5, then 7, then 9.


You can see why here: the CPU score trends upward with the CPU model, and the CPU score is 15% of the total score.


I probably should look at the Time Spy Extreme numbers too, because in the 3DMark whitepaper they published, they state that Time Spy Extreme makes the CPU test 3x harder on the CPU. It'll probably show a wider margin. Shoot, hang on...

Okay, here are the Time Spy Extreme numbers. Looks like the same trend.


Hopefully someone finds these charts interesting! LOL

Cheers


----------



## ssgwright

monitorhero said:


> Is your card watercooled? @ssgwright


yes


----------



## obscurehifi

I should point out that all of my graphs above were filtered to 3080 scores only. I didn't realize I hadn't labeled the Port Royal or Fire Strike Ultra charts with 3080... Maybe I'll edit them at some point.


----------



## Tergon123

ZAlien said:


> Hi guys. What would be your best 3080 recommendation? I'm currently aiming for the Strix OC (waiting for it to show up in stores), mainly because of the quality of the components used on that card. But is that the best choice, or is there something better I'm missing?


To be honest I was going to wait for a Strix as well, then discovered the MSI Suprim X: just as good, if not better in some ways. I flashed the Strix BIOS to it to unlock a bit more power limit, but between the fans then not being controlled correctly and it not scoring as well with that BIOS installed, I went back to stock MSI. It is a beast of a card, with an even bigger heatsink and very high quality, every bit as good as the Strix. Hope that helps


----------



## xc3_320w

Arni90 said:


> I _think_ the cards have some phases hardwired to the PCIe-slot power, and the card is very close to the power draw limit at stock configuration; GPU-Z reported 66W power draw from the PCIe-slot pretty much constantly before I shunt-modded my card's PCIe-slot power limit. If this is true, then the only solution is to shunt-mod the PCIe shunt.


Ahh.. yes of course, stupid me.

Why aren't the cards power balancing correctly?


----------



## Comalive

Are your guys' 0 RPM fan modes working, i.e. are your cards' idle temps fine? The Eagle OC (and probably the normal Eagle and Gaming OC) are insane heat traps, so I wonder if that is somehow related to the actual GPU or if the cards are just misdesigned/mismanufactured in some way.


----------



## bmgjet

xc3_320w said:


> Ahh.. yes of course, stupid me.
> 
> Why aren't the cards power balancing correctly?


All the cards having issues are running analog controllers (uP9511) where you set the balance with resistors.


----------



## xc3_320w

is it possible to modify just the PCIe input phase, so that it draws more from the other phases?

i mean, we are starting to split hairs between a full shunt mod and potentially just modifying one single component (i.e. ISENX(1..8) from the spec sheet for that controller)...

_edit: ISEN not PROG..._


----------



## Tergon123

Question for all of you, and I am sure it has been covered, but I didn't see it: Warzone. System specs first: i9 9900KS OC'd all-core 5 GHz, Asus Apex XI Z390, Crucial Ballistix Max 4000 RAM at 18-19-19-39, OC'd to 4400 19-22-22-39, very stable. Recently upgraded to an MSI Suprim X 3080 from a 2080 S. Also now running a 1440p 165 Hz Asus monitor (VG27AQ); with the 2080S I was running at 1080p 165. With the same graphics settings from one card to the other, I see almost no gain in fps; it fluctuates between 150-160. I thought fps would have been higher. Warzone settings are a mix of low and normal; shouldn't fps be higher with a 3080?


----------



## Cobra652

Hi everyone!
I got my 3080 FTW3 Ultra, but now I have a problem. The PC is hooked up to an LG OLED C9.
I'm trying to use GPU scaling via the image sharpening option in the 3D settings: render a bit below 4K (3264x1836) and scale it to 4K on the GPU for more FPS. But I can't... With GPU scaling enabled, 3840x2160@120Hz has 2 black borders on the sides of the screen. I can't fix it by changing the aspect ratio on the TV, and I can't change the aspect ratio settings in the desktop settings... The only solution I found is to set the resolution to 4096x2160, and the black bars are gone... but that's not the native resolution of the screen. Any solution for this problem? I found the same problem on the net, but no solution...
On my 2070 Super, GPU scaling worked fine without any problem, and no black borders at 3840x2160@120Hz with GPU scaling enabled.
The problem is only with 3840x2160 @ 120Hz.
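For context on why that sub-4K resolution helps in the first place: 3264x1836 is 85% of 4K on each axis, so the GPU only shades about 72% of the pixels per frame. Quick arithmetic as a sketch:

```python
def pixel_fraction(w_low, h_low, w_native, h_native):
    """Fraction of the native pixel count rendered at the lower resolution."""
    return (w_low * h_low) / (w_native * h_native)

frac = pixel_fraction(3264, 1836, 3840, 2160)
print(f"{frac:.4f}")  # 0.85 * 0.85 = 0.7225, i.e. ~28% fewer pixels shaded per frame
```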


----------



## obscurehifi

Tergon123 said:


> Question, for all of you and I am sure it has been covered, but I didn't see it. Warzone, system specs first. i9 9900KS OC all core 5GHz. Asus Apex XI z390, Ram Crucial Ballistic Max 4000 18,19,19,39, OCed to 4400 19, 22,22,39 it is very stable. Recently updated to MSI SuprimX 3080 from 2080 S. Also now running a 1440P 2K 165 Asus monitor VG27AQ. With the 2080S I was running on 1080P 165. With the same graphics settings from one card to the other, I almost see no gain in fps, fluctuates between 150-160, thought fps would have been higher. Warzone settings a mix of low and normal, figured fps should be higher with a 3080.?


Check out the benchmarks on Eurogamer, as they allow you to do comparisons in a lot of different games, at different resolutions, and compare against other cards. Wait for the content to load and they have whisker plots, video comparisons of gameplay, as well as an FPS counter through the video on a graph. They don't have the 2080 Super listed, but they have the 2080 Ti. Looks like what you experienced might be expected.
Nvidia GeForce RTX 3080 review: the super-performers


----------



## obscurehifi

Tergon123 said:


> Question, for all of you and I am sure it has been covered, but I didn't see it. Warzone, system specs first. i9 9900KS OC all core 5GHz. Asus Apex XI z390, Ram Crucial Ballistic Max 4000 18,19,19,39, OCed to 4400 19, 22,22,39 it is very stable. Recently updated to MSI SuprimX 3080 from 2080 S. Also now running a 1440P 2K 165 Asus monitor VG27AQ. With the 2080S I was running on 1080P 165. With the same graphics settings from one card to the other, I almost see no gain in fps, fluctuates between 150-160, thought fps would have been higher. Warzone settings a mix of low and normal, figured fps should be higher with a 3080.?


Definitely look at the Eurogamer link I posted above in a PC browser, as it shows more content than on a smartphone, as I just noticed.

I experienced something a little different: I was on a GTX 1080 OC at 1080p and went to both a 3080 and 1440p at the same time, and even though I jumped up in resolution, I still gained 50-100% more FPS. If you look at the charts at Eurogamer, it looks like some games show lower FPS on the 3080/1440p than the 2080 Ti/1080p. Looks like your video card upgrade mainly offset the increase in resolution, which is not a bad thing at all. More resolution is more costly, that's for sure!


----------



## obscurehifi

Hresna said:


> Got a ROG STRIX 3080 about a week ago. Thought I would share a recent experience involving the fans failing allowing the card to hit 83 degrees at idle...
> 
> I’ve been running the card in the quiet bios for 0db fan support and testing under volt curves in afterburner for a few days. Normally when I’m done testing for a while I’ll let the fans sit at 30%, or close afterburner and let the card’s bios take over.
> 
> Last night I closed afterburner and my wife and I settled in to watch a movie (thankfully, in the same room as the PC). About a half hour into our movie, we were jarred to hear the fans go full leaf-blower for a few seconds. I ran over to the PC to find the card had idled its way all the way to its safety cutoff of 83 degrees.
> 
> I see there’s a new firmware from asus dated 12 Dec that is supposed to improve “0db compatibility” so I will flash that today and hopefully it fixes whatever bug/glitch I encountered.
> I’m glad the card had a safety in place to stop it idling into dangerous temps, but still not thrilled that the card was hitting its 83 degrees without my knowing!


I had a similar thing happen to me yesterday: my card was stuck at 1845 MHz at idle. I closed down many apps and nothing resolved it. Rebooted, but it was still there. I then reinstalled the Nvidia driver and it dropped the idle down to 210 MHz until I rebooted, and then it was back at 1845. The only fix I found involved the AORUS Engine (OC tool) that I had starting with Windows. Even though I closed the AE from the Windows tray, the 1845 persisted. So I unchecked the option to start with Windows and rebooted; it was now running at 210 MHz. I then rechecked that start-with-Windows box and rebooted, and it's been fine every boot since. I had been doing a lot of overclocking benchmark runs and I think it mixed up some stuff, and the routine I described fixed it. Not sure if the same thing would help your situation, but maybe worth a shot with the ASUS overclocking tools, or even MSI AB if you use that.

EDIT - spelling...


----------



## Pedros

Quick question ... Suprim X ... is LM worth it or not really? I'm going to put it under water


----------



## ssgwright

IMO it is... just risky on GPUs. Temp matters when trying to push the MHz on these cards, and LM works so much better than any paste.


----------



## Pedros

if you isolate everything properly there's not a huge problem ... "I guess"


----------



## Falkentyne

Pedros said:


> if you isolate everything properly there's not a huge problem ... "I guess"


It's not a big deal if you're using a waterblock or AIO cooler.

For stock coolers it's annoying: if the LM hardens, or the temps rise and you need to redo the application, removing old LM is VERY time consuming. If you use 1500-grit sandpaper to help remove it with easy wiping (makes it a lot faster), the LM + isopropyl alcohol gets everywhere, and you would often have to remove the thermal pads from the cooler to save them from the LM, and they may not survive that. Which means an expensive thermal-pad replacement. It's just a lot of work. Of course, if you're undoing LM after just a few hours (example: you messed up a thermal pad, or your shunt mod needs work), that's different.

There's also the issue of the convex core causing an uneven fit which is hell for LM. So if you have rising temps after a few weeks, that's the reason (LM likes a solid even fit, otherwise premature partial gallium absorption can occur, among other issues).


----------



## Pedros

yeah ... I've heard about the issue with the convex core :x 

Btw, on another note ... is that "shunt mod with glue gun" still a thing with Ampere?


----------



## Tergon123

Pedros said:


> Quick question ... Suprim X ... is LM worth it or not really? I'm going to put it under water


Yes, under water it would help a lot; just nail-polish all possible areas that could come in contact with LM and you should be good to go.


----------



## Pedros

my guess is... bye bye warranty right?


----------



## Tergon123

Pedros said:


> my guess is... bye bye warranty right?


Yuup


----------



## Pedros

Btw, a question about the thermal pads, I was thinking about getting some Gelid GP-Extreme's or Minus 8 from TG.

Ek, in the user manual, says 1mm thick thermal pads.

Should I get the 1mm or the 1.5mm? I know that in the past, in some use cases, it was better to get the 1.5mm, but in this case I don't know.

Any experiences with any of the options of pads btw? findings?

( MSI Suprim X 3080 )


----------



## Falkentyne

Pedros said:


> yeah ... I've heard about the issue with the convex core :x
> 
> Btw, on another note ... is that "shunt mod with glue gun" still a thing with Ampere?


Don't use a glue gun.
You can either 

1) Shunt mod with solder by desoldering and replacing the shunts (5 mOhm to 3 mOhm)--this works the best, but some boards have very hard-to-melt solder, so this is also the hardest for people without experience (remember to flux!),

2) Solder by stacking shunts (5 mOhm OR 10 mOhm on top of 5 mOhm)--remember to flux!! A great video guide on how to stack shunts and flux properly is this one:






Note: extra care needs to be taken on shunts that have the edges LOWER than the middle housing, to ensure proper contact--multiple people have had issues getting good contact, so be aware!

3) Stack shunts by using MG 842AR conductive silver paint (either the pen or jar version, with a toothpick for the jar version) as a conductive adhesive without soldering. You are required to SCRAPE the silver conductive edges of the original shunt down to the bright silver with a small flat screwdriver, to remove the conformal coating from the edges; then apply paint fully on top of the edges of the shunts so that the top of the paint is ABOVE the middle housing and the edges are fully covered; then apply the new shunt directly on top; then apply another layer of paint on top of the conductive silver edges of the new shunt, and then CAREFULLY across the gap on the sides and edges between the two shunts (be careful--don't get any on the PCB--use Super 33+ electrical tape around the original shunts to COVER THE PCB for safety!!). Note: you can get a bit lower resistance by painting over the entire top of the new stacked shunt as well, completely edge to edge. This is easier for beginners. Due to a small resistance penalty in using 842AR paint instead of solder, 5 mOhm stacked shunts are recommended instead of higher mOhm.

NOTE: this method can be much more tricky if the edges of the original shunts are lower than the middle--then you must take extra care to make sure the silver edges are FULLY painted (and the glob of paint sits HIGHER than the middle housing) before applying the new shunt on top. If the original shunts are completely flat and flush, however, paint-adhesive stacking is much easier. (MSI and Founders Edition cards have those notorious shunt edges lower than the middle... this creates some difficulty, especially with the left 8-pin shunt! Use Super 33+ tape to insulate the PCB!!!)

4) Use MG 842AR as its own shunt (this acts as if you stacked a 10 to 15 mOhm shunt; the thicker the paint layer, the lower the resistance and the more effective the mod will be) by painting the entire surface of the original 5 mOhm shunt. You MUST scrape the edges of the original shunt down to the shiny silver to remove the conformal coating. Once again, "depressed" shunts are difficult to do because of contact issues on the edges, combined with lack of space, while flush flat shunts are very easy to paint. Use Super 33+ tape to protect your PCB--trust me on this.

The circuitwriter pen will not work for 3 or 4. Multiple failures reported with the circuitwriter pen.

Be very careful you don't bridge anything on the PCB. MG842AR paint can be cleaned with isopropyl alcohol and a tiny wipe or applicator or Q-tip.
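The numbers behind these options come down to parallel resistance: the controller computes current as V_sense / R_shunt, so lowering the effective shunt resistance makes the card under-report power by that ratio. A rough sketch (idealized; paint and contact resistance add a little on top of these figures):

```python
def parallel(r1, r2):
    """Equivalent resistance of two shunts stacked on top of each other (in parallel)."""
    return (r1 * r2) / (r1 + r2)

STOCK = 5.0  # mOhm, the original shunt value discussed above

options = {
    "replace 5 with 3 mOhm": 3.0,
    "stack 5 on 5 mOhm": parallel(STOCK, 5.0),
    "stack 10 on 5 mOhm": parallel(STOCK, 10.0),
    "paint-only layer, ~15 mOhm": parallel(STOCK, 15.0),
}

for label, r_eff in options.items():
    # Reported power scales with r_eff / STOCK, so the real draw at the
    # (unchanged) power limit rises by the inverse of that ratio.
    print(f"{label}: {r_eff:.2f} mOhm -> ~{STOCK / r_eff:.2f}x real power at the limit")
```

Stacking 5-on-5 roughly doubles the real power at a given reported limit, while a paint-only "shunt" in the 10-15 mOhm range gives a much gentler 1.3-1.5x bump.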


----------



## th3illusiveman

What's this stuff about an "effective clock" not matching what's shown in Afterburner and thus costing some performance? I saw somewhere that AB might not be reporting the actual frequency?


----------



## lordzed83

Falkentyne said:


> Don't use a glue gun.
> You can either
> 
> 1) shunt mod with solder by desoldering and replacing the shunts (5 mOhm to 3 mOhm)--this works the best but some boards have very hard to melt solder--so this is also the hardest for people without experience (remember to Flux!),
> 
> 2) Solder by stacking shunt (5 mOhm OR 10 mOhm on top of 5 mOhm)--remember to flux!! --a great video guide on how to stack shunts and flux properly is this one:
> 
> 
> 
> 
> 
> 
> Note: extra care needs to be done on shunts that have the edges LOWER than the middle housing, to ensure proper contact--multiple people have had issues getting good contact, so be aware!
> 
> 3) Stack shunts by using MG 842AR conductive silver paint (either the pen or jar version, with a toothpick for the jar version) as a conductive adhesive without soldering--you are required to SCRAPE the silver conductive edges of the original shunt down to the bright silver, with a flat small screwdriver to remove the conformal coating off the edges, then apply paint fully on top of the edges of the shunts so that the top of the paint is ABOVE the middle housing and the edges are fully covered, then apply the new shunt on top directly, then apply another layer of paint on top of the conductive silver edges of the new shunt and then CAREFULLY across the gap on the sides and edges between the two shunts (be careful--dont get any on the PCB--use Super 33+ electrical tape around the original shunts to COVER THE PCB for safety!!)--note--you can get a bit more lower resistance by painting over the entire top of the new stacked shunt as well completely edge to edge on the top. This is easier for beginners. Due to a small resistance penalty in using 842AR paint instead of solder, 5 mOhm stacked shunts are recommended instead of higher mOhm.
> 
> NOTE: this method can be much more tricky if the edges of the original shunts are lower than the middle--then you must do extra care to make sure the silver edges are FULLY painted (and the glob of paint sits HIGHER than the middle housing before applying the new shunt on top. If the original shunts are completely flat and flush however, paint adhesive stacking is much easier. (MSI and Founder's Edition cards have those notorious shunt edges lower than the middle...this creates some difficulty especially with the left 8 pin shunt! Use Super 33+ tape to insulate the PCB!!!
> 
> 4) Use MG 842AR as its own shunt (this acts as if you stacked a 10 to 15 mOhm shunt if you do this; the thicker the paint layer, the lower the resistance and the more effective the mod will be) by painting the entire surface of the original 5 mOhm shunt. You MUST scrape the edges of the original shunt down to the shiny silver to remove the conformal coating. Once again, "depressed" shunts are difficult to do because of contact issues on the edges, combined with lack of space, while 'flush' flat shunts are very easy to paint. Use Super 33+ tape to protect your PCB--trust me on this.
> 
> The circuitwriter pen will not work for 3 or 4. Multiple failures reported with the circuitwriter pen.
> 
> Be very careful you don't bridge anything on the PCB. MG842AR paint can be cleaned with isopropyl alcohol and a tiny wipe or applicator or Q-tip.


Don't use a glue gun because of what??




there ya go, an 800W 3090 with the glue gun method, NO PROBLEM


----------



## Pedros

Thank you for all the replies. I'm still waiting on my local store to get back to me about warranty when installing a water block and ****... If installing a block loses me the warranty anyway, I'll do the shunt mod; if the warranty is kept, I won't... at least not for now (that and LM).


----------



## eliwankenobi

Tergon123 said:


> Question, for all of you and I am sure it has been covered, but I didn't see it. Warzone, system specs first. i9 9900KS OC all core 5GHz. Asus Apex XI z390, Ram Crucial Ballistic Max 4000 18,19,19,39, OCed to 4400 19, 22,22,39 it is very stable. Recently updated to MSI SuprimX 3080 from 2080 S. Also now running a 1440P 2K 165 Asus monitor VG27AQ. With the 2080S I was running on 1080P 165. With the same graphics settings from one card to the other, I almost see no gain in fps, fluctuates between 150-160, thought fps would have been higher. Warzone settings a mix of low and normal, figured fps should be higher with a 3080.?


Warzone is badly optimized... if you've tuned your CPU, memory and GPU the best they can go, I would forget about it and enjoy...


Sent from my iPhone using Tapatalk


----------



## Pedros

So, I would have to shunt all of these, right?


----------



## Tergon123

eliwankenobi said:


> Warzone is badly optimized..... if you tuned your CPU, memory and GPU best they can go, I would forget about it and enjoy...
> 
> 
> Sent from my iPhone using Tapatalk


It's more the bother of inconsistent frame rate; it dips, drops, and fluctuates a lot. I have watched YouTube videos of similarly specced systems and the frame rate doesn't move as much as mine does. One second it's 180 FPS, the next it's 145, which is so bad it drives me crazy. I often wonder if it's the monitor holding it back; it's a 144Hz Asus VG27AQ which overclocks to 165Hz at 1440p. In Multiplayer I get up to 240 FPS, but Warzone rarely gets consistently anything over 160. It honestly used to run better and more consistently on my 2080S; I almost regret upgrading to the MSI Suprim X 3080.


----------



## Tergon123

Pedros said:


> So, i would have to shunt all these right?
> 
> 
> View attachment 2472761


Wow, is that the Suprim X 3080 naked?


----------



## Pedros

Tergon123 said:


> Wow is that the Suprimx 3080 naked?


yupes ... lots of free space to work on


----------



## Hresna

obscurehifi said:


> I had a similar thing happen to me yesterday because my card was stuck at 1845 MHz at idle...

Came to post an update and saw your post. I’m going to chalk this up to early-adopter driver instability I guess.

I had my same issue happen again on the new BIOSes.


Spoiler



I first flashed the Q-mode bios (by the way, when you flash bios on these strix dual-bios cards, only the currently active bios gets affected - confirmed by switching to other and checking version after flashing).

I switched to P mode to continue testing, and updated that BIOS too. I was hoping that the 30% minimum fan speed in P mode would prevent any further problems.

But then I found that even in P-mode, the 0dB fan option is available and enabled as default (begging the question - what is the purpose of the dual bios then?).

Anyway, I did some more runs in Afterburner then shut it down for the day, and sure enough saw in GPU-Z that my fans were at 0 as the card temp crept up past 55 degrees. So I shut it down.

I also noticed that if I run GPU Tweak II (Asus’ overclocking software) at any time after boot, it interferes with AB being able to control fans, so I have to do it on a fresh boot. Running GPU Tweak after AB runs seems to fix the fan curve problem but it’s not something I’m comfortable having to remember to do any time I use AB to undervolt the card.



TLDR: running AB and shutting it down deletes the fan curve altogether, and the card will idle its way to frying temperatures, like a frog in boiling water.

Hopefully the issue gets fixed. It’s not as though I can report it to ASUS and say “hey, your card has this bug if I try to undervolt it with third party software you don’t own or support or endorse”....

EDIT: added spoiler and tldr, because tldr.


----------



## Hresna

Comalive said:


> Are your guys' 0 rpm fan modes working -> are your cards' idle temps fine? The Eagle OC (and probably the normal Eagle and Gaming OC) are insane heat traps so I wonder if that is somehow related to the actual GPU or if the cards are just misdesigned/mismanufactured in some way.


My ASUS Strix will do 0dB, but see my other posts in this thread - I wouldn’t say it “works properly”.
-The 0 rpm / 0dB mode is only supposed to be available in one of two dip-switch selectable BIOSs, but it is present on both for some reason
-If you run afterburner and then shut it down, the fans will stay stuck at 0 until the card hits 83 degrees and trips something on the card to run them at full leaf-blower


----------



## eliwankenobi

Tergon123 said:


> It is more of the bother of in consistent frame rate, it dips and drops, and fluctuates allot, I have watch YouTube videos of similarly specs system and the frame rate doesn't move as much as mine does. One sec it is 180 fps next it is 145, which is so bad, drives me crazy. I often wonder it is the monitor holding it back it is a 144hz, Asus vg27aq which overclocks to 165hz 2K screen. In multi play I get up to 240 fps, but warzone rarly gets consistently anything over 160. It honestly used to run better and more consistent on my 2080S, almost regret updating to the MSI SuprimX 3080.


Hmm... Did you DDU the drivers when moving graphics cards? That is a very common issue, where settings or files from a previous driver install affect the new card, even if both are RTX. I highly recommend running DDU to clean up all that stuff and start fresh.

Also, do you find this inconsistency only in Warzone? Try other games so you can be sure that the lack of improvement is ONLY in Warzone and not across the board. 

Regarding the Warzone videos on YouTube, performance tanked for almost everyone after the ColdWar patch, so be sure the videos you use to compare are also no older than maybe a couple of weeks or so....

Lastly, Warzone seems to be a very CPU-intensive game, and you need to tune your memory to a T. Not necessarily speed, but tighten those timings for the lowest latency possible! In that regard, Micron memory is not as effective as a good set of Samsung B-die, where you can push past 4000 at very low timings.

Hope it's just a DDU issue!
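To put numbers on the "timings over speed" point: first-word latency is the CAS count in clock cycles, where the clock runs at half the MT/s rate, so latency_ns = CL x 2000 / MT/s. A quick sketch using the kit timings quoted earlier in the thread:

```python
def first_word_latency_ns(mt_s, cl):
    """CAS latency in ns: CL clock cycles at half the transfer (MT/s) rate."""
    return cl * 2000.0 / mt_s

# Timings from the kit discussed above (primary CAS figure only):
print(first_word_latency_ns(4000, 18))  # stock 4000 CL18 -> 9.0 ns
print(first_word_latency_ns(4400, 19))  # OC 4400 CL19 -> ~8.6 ns
```

So the 4400 CL19 overclock is a genuine latency win over 4000 CL18, not just a bandwidth one.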


----------



## obscurehifi

Hresna said:


> Came to post an update and saw your post. I’m going to chalk this up to early-adopter driver instability I guess.
> 
> I had my same issue happen again on the new BIOSs.
> 
> 
> Spoiler
> 
> 
> 
> I first flashed the Q-mode bios (by the way, when you flash bios on these strix dual-bios cards, only the currently active bios gets affected - confirmed by switching to other and checking version after flashing).
> 
> I switched to P mode to continue testing, and updated that BIOS too. I was hoping that the 30% minimum fan speed in P mode would prevent any further problems.
> 
> But then I found that even in P-mode, the 0dB fan option is available and enabled as default (begging the question - what is the purpose of the dual bios then?).
> 
> Anyway, I did some more runs in afterburner then shut it down for the day, and sure enough saw in GPU-Z that my fans were at 0 as the card temp creaped up passed 55 degrees. So I shut it down.
> 
> I also noticed that if I run GPU Tweak II (Asus’ overclocking software) at any time after boot, it interferes with AB being able to control fans, so I have to do it on a fresh boot. Running GPU Tweak after AB runs seems to fix the fan curve problem but it’s not something I’m comfortable having to remember to do any time I use AB to undervolt the card.
> 
> 
> 
> TLDR, running AB and shutting it down deletes the fan curve altogether and the card will idle it’s way to frying temperatures, like a frog in boiling water.
> 
> Hopefully the issue gets fixed. It’s not as though I can report it to ASUS and say “hey, your card has this bug if I try to undervolt it with third party software you don’t own or support or endorse”....
> 
> EDIT: added spoiler and tldr, because tldr.


As it turns out, I can get higher overclocks with the Aorus Engine software than with AB, since AB's memory clock stops at +1500 and Aorus Engine's does not. My card doesn't really need undervolting for benchmarks; it's more stable at higher clock speeds letting it use all the voltage it wants and power-limit itself with a shifted curve, +150 or +160, all on its factory BIOS.


Sent from my SM-G973U using Tapatalk


----------



## Tergon123

Sorry this site doesn't play well on a phone


----------



## Tergon123

eliwankenobi said:


> Hmm... Did you DDU drivers when moving graphics cards? That is a very common issue, where settings or files from a previous driver install, affects a new card, even if both are RTX. I highly recommend doing DDU to clean up all stuff and start fresh.
> 
> Also, do you find this inconsistency only in Warzone? Try other games so you can be sure that the lack of improvement is ONLY in Warzone and not across the board.
> 
> Regarding the Warzone videos on YouTube, performance tanked for almost everyone after the ColdWar patch, so be sure the videos you use to compare are also no older than maybe a couple of weeks or so....
> 
> Lastly, Warzone seems to be a very CPU intensive game, and you need to tune your memory to the last T. Not necessarily speed, but tighten those timings for the lowest latency possible! In that regard, Micron memory is not as effective as a good set of Samsung B-Die memory where you can push memory at over 4000 at very low timings
> 
> Hope it’s just a DDU issue[emoji1696]


Will try ddu


----------



## Pedros

For DDU, start Windows in Safe Mode.


----------



## Tergon123

Pedros said:


> for DDU, start windows in Safe Mode.


Yeah I will follow one of the YouTube vids to do it.


----------



## eliwankenobi

Tergon123 said:


> Will try ddu


Good luck!


----------



## joyzao

Can I flash the Suprim BIOS on a Trio X? RTX 3080


----------



## eliwankenobi

joyzao said:


> I Can flash suprim bios in trio x? Rtx 3080


Yes, people have done it. I’ve seen people here saying they’ve had better results with stock X Trio BIOS.


----------



## ducegt

Suprim on Trio working better than stock for me.


----------



## Tergon123

ducegt said:


> Suprim on Trio working better than stock for me.


That is awesome, I have a SuprimX.


----------



## Pedros

Trio and Suprim are similar. The Suprim has more power phases, but that's it, if I recall... even the EK blocks are the same.


----------



## BluePaint

Yes, in addition to more phases, the Suprim has 4 MLCCs instead of the 1 on the original Trio (there was supposedly a revision with 2 MLCCs). Both use the same PCB.
Does anyone know, btw, how to find out which exact kind of MLCCs are used? They don't seem to have any kind of marking because they are too small.


----------



## AveragePC

Messed with undervolting for the first time; I wasn't really sure what I was doing at first, but I think I figured it out. The card was stable in Time Spy, Port Royal, FurMark, Cyberpunk 2077, Dirt Rally 2.0 (VR), and iRacing (VR).

XC3 Ultra at 1880MHz / 850mV, +600 MHz on the memory.

Time Spy: 17743
Port Royal: 11317

Seems pretty middle of the road, but it runs cooler and performance improved a bit. I'll have to test in VR and see if my percentage of dropped frames went down, which would be nice. The stable frequency will help with locking in graphical settings in sim racing games that don't trigger ASW.
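For a rough sanity check on why that undervolt runs cooler: dynamic power scales roughly with frequency times voltage squared (the classic CV²f approximation). A sketch; the 1905 MHz / 1.000 V reference point is an assumed stock-ish operating point, purely illustrative:

```python
def relative_dynamic_power(f_new, v_new, f_ref, v_ref):
    """CV^2*f approximation: dynamic power scales with frequency * voltage^2."""
    return (f_new / f_ref) * (v_new / v_ref) ** 2

# Undervolt from the post above vs an assumed (illustrative) stock point:
r = relative_dynamic_power(1880, 0.850, 1905, 1.000)
print(f"~{r:.0%} of reference dynamic power")  # ~71%
```

Dropping 150 mV at nearly the same clock cuts dynamic power by roughly a quarter to a third, which is why temps fall with only a small performance cost.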


----------



## joyzao

I got a better score with the Suprim X BIOS on the Trio X board. Is it safe to use this BIOS on the Trio X? I noticed an increase of 6 degrees, but in Port Royal it ran at a clock above 2055 MHz.


----------



## Broooo

Hi guys, I have a question: which card is better between the MSI 3080 Suprim X and the Palit 3080 GameRock OC for watercooling? The PCB differences between the cards are: one fewer VRM phase for the Suprim X (20 vs 21 for the GameRock) but 30 more ceramic capacitors (40 vs 10 for the GameRock).
Consider that I'd like to watercool the card and, if possible, flash the EVGA XOC BIOS. Thanks in advance!


----------



## man from atlantis

Broooo said:


> Hi guys, I've a question: which card is better between MSI 3080 Suprim X and Palit 3080 GameRock OC for watercooling? The PCB differencies between the cards are: 1 VRM less for the Suprim X (20 vs 21 for the GameRock) but 30 ceramic capacitor more (40 vs 10 for the GameRock)
> Consider that I would like to watercool the card and if possible I would like to flash the EVGA XOC BIOS (500 WATTS), thanks in advance!


The EVGA XOC BIOS is 450W for the 3080. The GameRock OC BIOS is 440W, the Suprim X 430W. I tried both the Strix and XOC BIOS on my GameRock; in short, they're a mess on the card, and I don't recommend either of them. I heard the GameRock isn't compatible with water cooling blocks, but WC isn't in my interest, so don't take my word for it.

Sent from my ONEPLUS A5000 using Tapatalk


----------



## Pedros

so... for the Suprim X, is there any really good BIOS to try?


----------



## Tergon123

Pedros said:


> so... for the Suprim X is there any really good bios to try on?


I have tried quite a few of the various 450W BIOSes on the Suprim X, including the Strix, Gigabyte and EVGA ones, and they all mess up the fan control, so you have to manually control the fans in Afterburner. I also found the boosting and clocks to be all over the place. On the core, the stock BIOS tops out at +120 or so max; with the Strix, for example, it would run stable up to +145, but that wasn't reflected in the score. Just odd behavior.


----------



## Broooo

man from atlantis said:


> EVGA XOC bios is 450W for 3080. GameRock OC bios is 440W, Suprim X is 430W. I tried both Strix and XOC bios on my GameRock, in short they're mess on the card, I don't recommend any of them. I heard GameRock isn't compatible with water cooling blocks but wc isn't in my interest so don't take my word granted.
> 
> Sent from my ONEPLUS A5000 using Tapatalk


Yes, I searched the web for a while and there are actually no waterblocks for the GameRock; I will probably go for the Suprim X.


----------



## Broooo

Tergon123 said:


> I have tried quite a few of the various 450 Watts on the suprimx. Including the Strix, gigabyte, EVGA ones and they all mess up the fan control and you have to manually control the fans in afterburner. I also found the boosting and clocks to be all over the place. On the core stock bios tops out at 120 or so max. With for example the strix would run stable up to 145, but would reflect that in the score. Just odd behavior.


So if I get a Suprim X your advice is to leave the stock BIOS? Even if I watercool it?


----------



## Broooo

What’s the best 3080 for watercooling? (I like overclocking)


----------



## bmgjet

Here's a bit of info from a friend who couldn't get his XC3 Black over 320W on other BIOSes.
He checked the power tables for every 3080 BIOS and selected the 2-plug one with the highest PCIe slot power limit, which is this one:








Gigabyte RTX 3080 VBIOS: 10 GB GDDR6X, 1440 MHz GPU, 1188 MHz Memory
www.techpowerup.com





Fan control is messed up on auto but fine if you use a custom fan profile.
He lost 2 DP ports, which is OK for him since he just uses 1 DP and 1 HDMI.
Result: the card now pulls 358W before slamming into the slot power limit.
Gave him an extra 75 MHz on top of his average clock speed.
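As a rough budget, that 358W figure is consistent with the input-rail limits: each 8-pin is spec'd for 150 W, plus whatever the vBIOS allows from the slot (the ~66 W slot cap below is an assumption for illustration; the PCIe spec ceiling is 75 W, and the exact value comes from the power tables mentioned above). A sketch:

```python
# Rough board-power budget for a 2x8-pin card like the XC3.
EIGHT_PIN_W = 150  # W per 8-pin connector (PCIe spec)
SLOT_CAP_W = 66    # W, assumed vBIOS slot limit (illustrative; spec max is 75 W)

def board_budget(n_8pin, slot_cap=SLOT_CAP_W):
    """Total sustained draw before one of the input rails hits its limit."""
    return slot_cap + n_8pin * EIGHT_PIN_W

print(board_budget(2))  # ~366 W ceiling; the 358 W observed above sits just under it
```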


----------



## ssgwright

hmm I'll try on my TUF and report back


----------



## SoldierRBT

Broooo said:


> What’s the best 3080 for watercooling? (I like overclocking)


If you like overclocking, any 2x8-pin/3x8-pin card shunt modded + waterblock is the way to go. If you're not comfortable doing a shunt mod, a 3x8-pin + waterblock (450W) is a very good option.


----------



## ssgwright

GIG bios results:
















My TUF (although with tweaking I did hit over 12,850; I didn't tweak the Gig, so this is the comparison):

















What's interesting is that with the Gig BIOS I only hit 69% TDP, while with my current one it hits 75%, and if you notice, the max voltage on the Gig is 1.093 compared to my TUF, which only hit 1.087... hmm... I might need to do some more testing with this Gigabyte BIOS.


----------



## edhutner

About the MSI 3080 Suprim X: I am expecting one soon. I have a custom water loop, and most probably in the near future will put a water block on the Suprim. I recall that in the past some MSI cards had issues (the 2080 Ti locking to 1350 MHz or something like that) when the original GPU fan headers were not used. Does anybody know if this would be the case with the Suprim X?


----------



## Tergon123

Broooo said:


> So if I get a Suprim X your advice is to leave the stock BIOS? Even if I watercool it?


No; if you water cool, it would be worth trying other BIOSes then, but be prepared to control the fans manually.


----------



## Pedros

well, if you watercool ... there's a high chance that you actually need to control the fans manually


----------



## edhutner

I have similar question too. What happens when you watercool the suprim x and do not use the gpu fan headers at all (leave them empty)?


----------



## Stash

BluePaint said:


> Yes, in addition to more phases, Suprim has 4 MLCCs instead of 1 of the original Trio (there was supposedly a revision with 2 MLCCs). Both use the same PCB.
> Does anyone know btw, how to find out which exact kind of MLCCs are used? They don't seem to have any kind of marking cause they are too small.


I don't think the Trio revision with 2 MLCCs was ever a real thing sadly.


----------



## MikeGR7

Stash said:


> I don't think the Trio revision with 2 MLCCs was ever a real thing sadly.


It was fake news, people confused the 3090 pcb with a 3080 "revision" lol.

No reason to fret though, Trios are beast core clockers in my experience anyway (I've tried a lot of cards because I buy them for friends with my VISA).

Definitely better clockers than ASUS cards on average, which are all-MLCC, which goes to show how overrated this spec is.

Edits: Grammar


----------



## Broooo

SoldierRBT said:


> If you like overclocking, any 2x8pin/3x8pin card shunt modded + waterblock is the way to go. If you're not comfortable doing a shunt mod, a 3x8pin + waterblock (450W) is a very good option.


So the Suprim X is a good card for watercooling, awesome! Maybe I'll try some 450W BIOS.


----------



## Broooo

Tergon123 said:


> No; if you water cool, it would be worth trying other BIOSes, but be prepared to control the fans manually.


If I watercool the card, there will be no more fans to check hahaha 🤣


----------



## Broooo

So guys the Suprim X is a good card, even better if I watercool it and I try to change the BIOS, but are there other cards that maybe go up better in frequency? Maybe an FTW3 Ultra (for its 2 extra VRM phases, even though it has 20 fewer ceramic capacitors) or a Strix OC (for its 2 extra VRM phases and its 20 extra ceramic capacitors)? Thank you for your replies!


----------



## Warrimonk

To people with the Suprim X, how are you liking it? I placed an in-store backorder for a Suprim X (first in line!) and am selling my EVGA XC3 Ultra to a friend at cost.

Can't wait to shed that stupid 340W power limit and actually push a proper 3080.


----------



## SPL Tech

Any ideas for the best BIOS for the EVGA XC3? This low power limit is lame!


----------



## joyzao

I have a Trio X. I flashed the BIOS from the Suprim X and there was a good improvement in the score: 12800 in Port Royal. Can I try the Asus Strix BIOS? Is it safe?


----------



## xc3_320w

ssgwright said:


> What's interesting is that with the Gigabyte BIOS I only hit 69% TDP, while my current BIOS hits 75%. And if you notice, the max voltage on the Gigabyte was 1.093 V, compared to only 1.087 V on my TUF... hmm... I might need to do some more testing with this Gigabyte BIOS.


Why are you only hitting 250W board power draw with either BIOS?


----------



## bmgjet

xc3_320w said:


> Why are you only hitting 250W board power draw with either BIOS?


Shunt modded, obviously.


----------



## xc3_320w

Yeh - That's what I had assumed...

For reference I tried that GB BIOS: stock settings I get just under 11500 in Port Royal...

I went back to the Palit BIOS: stock settings I get around 11400...

With my undervolt (2055 MHz @ 0.912 V) I get about the same... 11900... but still power-limited to 330W (GPU-Z), regardless of what the BIOS says. I guess the GPU is monitoring the three power inputs (PCIe slot, 8-pin #1 and 8-pin #2) and deciding that something's up. My maximums are PCIe = 66.5W, 8-pin #1 = 121.2W, 8-pin #2 = 142.7W, none of which are over specification, but the GPU presumably samples these at millisecond intervals, as opposed to the 0.1-second polling GPU-Z does...

So, if you're not going to tweak the card, the GB BIOS seems to be the best of the bunch; otherwise I think the Palit is marginally better. Then there are the HDMI/DisplayPort issues (I haven't tested all 3 DPs on my XC3, so I can't really comment).
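Those per-rail maximums are easy to sanity-check against the nominal spec limits. A minimal Python sketch (the limit figures are the usual nominal PCIe values, and the observed numbers are simply copied from the post above; nothing here is measured by the script):

```python
# Hypothetical sanity check of the per-rail readings quoted above.
# Limits are nominal spec figures; observed values come from the post.

RAIL_LIMITS_W = {
    "PCIe slot (12V)": 66.0,  # PCI-SIG allows 5.5 A on the 12 V slot pins
    "8-pin #1": 150.0,        # nominal rating of an 8-pin PCIe connector
    "8-pin #2": 150.0,
}

observed_w = {
    "PCIe slot (12V)": 66.5,
    "8-pin #1": 121.2,
    "8-pin #2": 142.7,
}

def check_rails(observed, limits):
    """Return total board power and the rails at or over their limit."""
    over = [rail for rail, watts in observed.items() if watts >= limits[rail]]
    return sum(observed.values()), over

total, over = check_rails(observed_w, RAIL_LIMITS_W)
print(f"total: {total:.1f} W, rails at/over limit: {over}")
```

Run against the numbers above, the only rail sitting at its limit is the slot, which suggests the slot allowance rather than the 8-pins is the binding constraint on these 2-plug cards.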


----------



## cennis

Could it be that the readouts in GPU-Z are wrong? I found this comment where they claim to have tested the power draw on the PSU side:

TUF 572W system draw
XC3 544W system draw
FTW3 514W system draw (the worst, because the XC3 doesn't have the 3rd PCIe plug)

Viakruzis Nanana (2 months ago, edited):
Hey man, I like your channel and appreciate your tests. I tried flashing the FTW3 450W BIOS on my XC3 Ultra Gaming and did my own tests. I am using a Corsair 1200i, so I can measure how much power the system is using. As they say, 8-pin #1 and #3 are duplicated, so you end up having less power limit, as it reports around 140W. In my testing, with the 450W FTW3 BIOS I got 514W power draw from the wall running Superposition 4K Optimized; with the Asus TUF BIOS, 572W; and with the XC3 original BIOS, 544W. (My idle system uses around 130W.)


----------



## Tergon123

Warrimonk said:


> To people with the Suprim X, how are you liking it? I placed an in-store backorder for a Suprim X (first in line!) and am selling my EVGA XC3 Ultra to a friend at-cost.
> 
> Cant wait to shed that stupid 340W power limit and actually push a proper 3080.


Well, the Suprim X BIOS power limit is 430W, and I love the card; it is an absolute beast.


----------



## Broooo

Broooo said:


> So guys the Suprim X is a good card, even better if I watercool it and I try to change the BIOS, but are there other cards that maybe go up better in frequency? Maybe an FTW3 Ultra (for its 2 extra VRM phases, even though it has 20 fewer ceramic capacitors) or a Strix OC (for its 2 extra VRM phases and its 20 extra ceramic capacitors)? Thank you for your replies!


Any opinion?


----------



## Kold

Long shot here, but has anyone here with an XC3 successfully flashed a different BIOS that increased the power limit from 330W-ish to something like 360-375? If so, can you please tell me which BIOS to use? I would greatly appreciate it.

I have just received a brand new 3080 XC3 Ultra Hybrid, and the card can barely maintain a steady 2000 MHz without constantly hitting a power limit.


----------



## xc3_320w

Kold said:


> has anyone here with an XC3 successfully flashed a different BIOS that increased the power limit from 330W-ish to something like 360-375?


Not as far as I know.

The best you can do is either the Palit/TUF OC or the Aorus Xtreme WaterForce BIOSes...

You won't get much more than 335W though...


----------



## Kold

xc3_320w said:


> Not as far as I know.
> 
> The best you can do is either the Palit/TUF OC or the Aorus Xtreme WaterForce BIOSes...
> 
> You won't get much more than 335W though...


Darn... my only option appears to be to try to trade it for an FTW3 or TUF. Thanks.


----------



## xc3_320w

That or shunt mod it...


----------



## SPL Tech

xc3_320w said:


> Not as far as I know.


LOL, isn't that exactly what you did? You literally said 'For reference I tried that GB BIOS'.


----------



## SPL Tech

Kold said:


> Darn... my only option appears to be to try to trade it for an FTW3 or TUF. Thanks.


Yes, the TUF BIOS. It will boost about 50W above the XC3 BIOS.


----------



## SPL Tech

ssgwright said:


> Gigabyte BIOS results:
> View attachment 2473072
> 
> View attachment 2473073
> 
> My TUF (although with tweaking I did hit over 12,850, I didn't tweak the Gigabyte, so this is a straight comparison):
> View attachment 2473075
> 
> View attachment 2473076
> 
> 
> What's interesting is that with the Gigabyte BIOS I only hit 69% TDP, while my current BIOS hits 75%. And if you notice, the max voltage on the Gigabyte was 1.093 V, compared to only 1.087 V on my TUF... hmm... I might need to do some more testing with this Gigabyte BIOS.


What was the stock XC3 BIOS score?


----------



## xc3_320w

SPL Tech said:


> LOL, isn't that exactly what you did? You literally said 'For reference I tried that GB BIOS'.


Yeh?

And as far as I know, you can't get much past the 330W limit...


----------



## ssgwright

Shunt mod is the way to go with these cards... I can maintain 2130 on my TUF in all games (2160-2175 in Port Royal).


----------



## xc3_320w

what sort of average performance difference do you see (in gaming FPS) ?


----------



## ssgwright

never really checked, but I'd much rather know I'm running a game at 2130mhz vs 2050mhz


----------



## Kold

If I do a shunt mod, I can kiss my warranty goodbye? Is there a ghetto way to do it where it's reversible?


----------



## ssgwright

There is... apparently you can use conductive paint over the shunts, or you may be able to hot-glue shunts on top, as long as there's good contact and conductive compound between the shunts (I tried this, but without a conductive compound between the shunts, and it didn't work).


----------



## Nizzen

Broooo said:


> Any opinion?


It's all about the lottery and cooling. Buy 100 cards, and take the one that is the best binned.

Or stick with your card, which is within +/- 5% of all cards in the world.


----------



## Broooo

Nizzen said:


> It's all about the lottery and cooling. Buy 100 cards, and take the one that is the best binned.
> 
> Or stick with your card, which is within +/- 5% of all cards in the world.


Yes, I know this, but I meant: "which cards have a better PCB (ceramic capacitors, etc) than the Suprim X?" or "which card goes up better in frequency with the same chip level?"


----------



## Kold

As far as I know, just the FTW3. It gives the most wattage, but I wouldn't say it's a better-built card.


----------



## Broooo

Kold said:


> As far as I know, just the FTW3. It gives the most wattage, but I wouldn't say it's a better-built card.


Great, thanks!


----------



## Oeli1002

Hello @all,

I have a PNY GeForce RTX 3080 10GB XLR8 Gaming REVEL EPIC-X and would like to update the BIOS.

On TechPowerUp is version 94.02.26.40.8F linked. PNY RTX 3080 VBIOS

According to GPUZ 94.02.26.08.83 is installed.

How do I find out which version is more current? 

Based on nvflash, I suspect that the already-installed version 94.02.26.08.83 is more recent than the one from the TPU site.

Are there now also other BIOS versions for the PNY cards that can be flashed?

Many thanks for your help.

Regards, Thomas


----------



## Nizzen

Broooo said:


> Yes, I know this, but I meant: "which cards have a better PCB (ceramic capacitors, etc) than the Suprim X?" or "which card goes up better in frequency with the same chip level?"


The Asus Strix has hotwire support for voltage control. It's one of the best-built cards, perfect for overclocking on water, chilled water and LN2.
This is why I have 2x 3090 Strix OC.


----------



## SPL Tech

ssgwright said:


> Shunt mod is the way to go with these cards... I can maintain 2130 on my TUF in all games (2160-2175 in Port Royal).


So what is your new perfcap reason then? Is it voltage? What's holding the card back from going even higher?


----------



## Nizzen

SPL Tech said:


> So what is your new perfcap reason then? Is it voltage? What's holding the card back from going even higher?


It wants more current


----------



## SPL Tech

Nizzen said:


> It wants more current


Wouldn't the shunt mod have fixed that though? By shunting it you should eliminate all power-related restrictions.


----------



## Nizzen

SPL Tech said:


> Wouldn't the shunt mod have fixed that though? By shunting it you should eliminate all power-related restrictions.


No, the card doesn't magically give more than 1.1 V even when shunted.

1.15 V with a vmod is about the upper limit that high-end watercooling can handle in short benchmarks.


----------



## edhutner

Anyone tried Alphacool Aurora block on 3080 Trio or Suprim card?








Alphacool Eisblock Aurora Acryl GPX-N RTX 3090/3080 Suprim X with Backplate


The Alphacool Eisblock Aurora Acryl GPX-N RTX 3080/3090 combines style with performance and extensive digital RGB lighting. Over 17 years of experience have gone into this graphics card water block and make it the...




www.alphacool.com





141 EUR including the backplate sounds like a good price.


----------



## ssgwright

well, here's my latest port score:

















I scored 12 912 in Port Royal


Intel Core i9-10850K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10




www.3dmark.com





Average clock of 2,183... trying to get over 2,200, but I don't think I can without getting it colder somehow... it averages 41°C as it is.


----------



## nexxusty

bmgjet said:


> Here's a bit of info from a friend who couldn't get his XC3 Black over 320W on other BIOSes.
> He checked the power tables for every 3080 BIOS and selected the 2-plug one with the highest PCIe slot power limit, which is this one.
> 
> 
> 
> 
> 
> 
> 
> 
> Gigabyte RTX 3080 VBIOS
> 
> 
> 10 GB GDDR6X, 1440 MHz GPU, 1188 MHz Memory
> 
> 
> 
> 
> www.techpowerup.com
> 
> 
> 
> 
> 
> Fan control is messed up on auto but fine if you use a custom fan profile.
> He lost 2 DP ports, which is OK for him since he just uses 1 DP and 1 HDMI.
> Result: the card now pulls 358W before it slams into the slot power limit.
> Gave him an extra 75 MHz on top of his average clock speed.


Yup, that's exactly where I am at with my RTX 3080 Ventus 3X.

Smart minds bro, lol.

I did the exact same thing, checked out the power limits on every 2x8pin card, this one was the best on paper. It immediately showed a boost in Quake 2 RTX benchmarks. Glad to know there isn't anything else I can do besides a shunt mod, so.... On to the shunt mod, LOL.

I might sell this for a small profit and buy a real 3080 though... This power limit crap is very annoying.


----------



## xc3_320w

ssgwright said:


> well, here's my latest port score:
> 
> View attachment 2473454
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 12 912 in Port Royal
> 
> 
> Intel Core i9-10850K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> Average clock of 2,183... trying to get over 2,200, but I don't think I can without getting it colder somehow... it averages 41°C as it is.


Nice.

I get 11900, with just a waterblock doing about 2055Mhz.

I wonder if the 8% port royal increase translates to +8% gaming FPS?


----------



## ssgwright

OK, I lied, I was finally able to hold 2190:


















I scored 12 973 in Port Royal


Intel Core i9-10850K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10




www.3dmark.com


----------



## AveragePC

So is the xc3 that bad of a card? LOL seeing a lot of people mentioning it’s not that great.


----------



## eliwankenobi

AveragePC said:


> So is the xc3 that bad of a card? LOL seeing a lot of people mentioning it’s not that great.


Nah... not bad, just power-limited... probably because the cooler isn't sufficient and wouldn't handle more wattage.


Sent from my iPhone using Tapatalk


----------



## Arni90

xc3_320w said:


> With my undervolt (2055 MHz @ 0.912 V) I get about the same... 11900... but still power-limited to 330W (GPU-Z), regardless of what the BIOS says. I guess the GPU is monitoring the three power inputs (PCIe slot, 8-pin #1 and 8-pin #2) and deciding that something's up. My maximums are PCIe = 66.5W, 8-pin #1 = 121.2W, 8-pin #2 = 142.7W, none of which are over specification, but the GPU presumably samples these at millisecond intervals, as opposed to the 0.1-second polling GPU-Z does...


The PCIe slot is specified for 66W through the +12V connectors, so you are hitting the limit there.



SPL Tech said:


> Wouldent the shunt mod have fixed that though? By shunting it you should eliminate all power related restrictions.


No, the GA102 as configured in the RTX 3080 can easily hit 500W below 1050 mV. I have soldered 8 mOhm shunts on top of the stock 5 mOhm ones, and I'm still hitting the power limit in Time Spy GT2 causing throttling down to 1000 mV or so.
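The arithmetic behind that stacked-shunt setup is simple enough to sketch (assuming, as is usual for this mod, that the power controller still uses the stock 5 mΩ value when converting its sensed voltage to current; the 8 mΩ part is the one mentioned above):

```python
# Sketch of the stacked-shunt math: an extra resistor soldered on top
# of each stock shunt puts the two in parallel, so the sensed voltage
# drops and the controller under-reports the real power.

def parallel(r1_mohm, r2_mohm):
    """Effective resistance of two shunts in parallel, in milliohms."""
    return (r1_mohm * r2_mohm) / (r1_mohm + r2_mohm)

R_STOCK = 5.0  # stock shunt, mOhm
R_ADDED = 8.0  # resistor stacked on top, mOhm

r_eff = parallel(R_STOCK, R_ADDED)  # ~3.08 mOhm
scale = R_STOCK / r_eff             # real power ~= reported power * scale

print(f"effective shunt: {r_eff:.2f} mOhm, scale factor: {scale:.3f}")
print(f"a reported 330 W limit is really ~{330 * scale:.0f} W")
```

With a 1.625x scale factor, a BIOS-reported 330W ceiling corresponds to over 500W at the card, which matches the Time Spy throttling described in the post despite the mod.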


----------



## Kold

nexxusty said:


> Yup, that's exactly where I am at with my RTX 3080 Ventus 3X.
> 
> Smart minds bro, lol.
> 
> I did the exact same thing, checked out the power limits on every 2x8pin card, this one was the best on paper. It immediately showed a boost in Quake 2 RTX benchmarks. Glad to know there isn't anything else I can do besides a shunt mod, so.... On to the shunt mod, LOL.
> 
> I might sell this for a small profit and buy a real 3080 though... This power limit crap is very annoying.


Help me out here.. I tried the exact same BIOS with my XC3 Ultra and there was no increase in power limit. Also, the power limit slider was set to a max of "100%".

So what did you guys do differently to show it working?

*EDIT*: I have tried the TUF, all the Gigabyte ones, and the FTW3 + 450W update BIOSes.. and none gave me a higher score in Port Royal than just straight-up undervolting on my stock BIOS to 0.943 V @ 2055 MHz. This keeps me under 1.0 V, which is where it hits a power wall and downclocks. It's unfortunate, because this is a really nice card.. but I can't help but wonder what kind of REAL WORLD FPS I am missing out on compared to, say, an FTW3.

Can someone with a 9900K @5GHZ and an FTW3 OCed let me know what kind of 3DMark scores they're getting. Hell, anyone with a 9900K and a higher power 3080.. I really want to see if it's worth all the hassle to sell this XC3 Ultra Hybrid and then try to get an FTW3.

*EDIT 2*: And I would consider shunt modding the card, but I upgrade GPUs often and don't want the warranty ruined on this card for when I eventually do sell it.

*EDIT 3*: Last EDIT, I promise! lol.. Here's what mine looks like score wise, power used and voltages when running an undervolted Port Royal. I mean, it's not bad, but the card is seriously crying for more volts!









I scored 12 062 in Port Royal


Intel Core i9-9900K Processor, NVIDIA GeForce RTX 3080 x 1, 16384 MB, 64-bit Windows 10




www.3dmark.com


----------



## Pedros

xc3_320w said:


> Nice.
> 
> I get 11900, with just a waterblock doing about 2055Mhz.
> 
> I wonder if the 8% port royal increase translates to +8% gaming FPS?


nope, you'll barely see much difference in gaming ... this is for synthetics only


----------



## xc3_320w

Kold said:


> Can someone with a 9900K @5GHZ and an FTW3 OCed let me know what kind of 3DMark scores they're getting. Hell, anyone with a 9900K and a higher power 3080.. I really want to see if it's worth all the hassle to sell this XC3 Ultra Hybrid and then try to get an FTW3.


12k is a pretty good score. Which BIOS? Was it the stock Hybrid one?

Even if 3DMark scores translated to gaming FPS, you're looking at maybe 7% by going the shunt mod route (which is probably going to net you better performance than just an XOC BIOS on, say, an FTW3)...

7% = ssgwright was getting 129XX with a shunt mod, on a TUF OC, on water...


----------



## SPL Tech

Nizzen said:


> No, the card doesn't magically give more than 1.1 V even when shunted.
> 
> 1.15 V with a vmod is about the upper limit that high-end watercooling can handle in short benchmarks.


So how much of a difference did you see between stock settings and the shunt mod? I am wondering if it's worth it. If I gain 5 FPS at 4k I would say its worth it. But it's not worth it to me for like 1 FPS.


----------



## Falkentyne

SPL Tech said:


> So how much of a difference did you see between stock settings and the shunt mod? I am wondering if it's worth it. If I gain 5 FPS at 4k I would say its worth it. But it's not worth it to me for like 1 FPS.


On a 3090, going from 400W to 530W is maybe about 9% FPS at 4K, only tested in Overwatch at 200% render scale (1080p = 4x SSAA, thus 4K). It's going to be more if your base TDP is lower.


----------



## xc3_320w

SPL Tech said:


> If I gain 5 FPS at 4k


Yeh, I know what you mean, I am in the same boat.

But I think we need to think more about 1% lows rather than average FPS... At least for me, those 1% lows are way more important than average FPS... for example, I don't really care about my average FPS going 170 -> 190, but if my 1% lows go from 90 -> 130... (lol), then that would be a reason to shunt it...

But then, I reckon your "formula" needs to take into account things like CPU and RAM timings, i.e. it becomes a LOT more complicated than just GPU MHz.
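Since 1% lows keep coming up here, this is roughly how they are derived from a frame-time log. A hypothetical sketch (the sample frame times are made up, and tools differ in whether they average the slowest 1% of frames, as below, or report the 99th-percentile frame time):

```python
# Illustrative 1%-low calculation from a list of frame times in
# milliseconds. The sample data is invented for the example.

def one_percent_low_fps(frametimes_ms):
    """Average FPS of the slowest 1% of frames."""
    worst = sorted(frametimes_ms, reverse=True)  # longest frames first
    n = max(1, len(worst) // 100)                # slowest 1%, at least one frame
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms

def average_fps(frametimes_ms):
    return 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

# 200 mostly-fast frames with a couple of hitches mixed in
frames = [6.0] * 198 + [20.0, 25.0]
print(f"avg: {average_fps(frames):.0f} fps, 1% low: {one_percent_low_fps(frames):.0f} fps")
```

The example shows why the metric is useful: two hitches barely move the average, but they drag the 1% low way down, which is exactly the stutter you feel in-game.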


----------



## Falkentyne

xc3_320w said:


> Yeh, I know what you mean, I am in the same boat.
> 
> But I think we need to think more about 1% lows rather than average FPS... At least for me, those 1% lows are way more important than average FPS... for example, I don't really care about my average FPS going 170 -> 190, but if my 1% lows go from 90 -> 130... (lol), then that would be a reason to shunt it...
> 
> But then, I reckon your "formula" needs to take into account things like CPU and RAM timings, i.e. it becomes a LOT more complicated than just GPU MHz.


CPU and RAM frequency and timings (especially sub/terts) have a far bigger impact on 1% lows than the video card. If you're getting frame stutters/hitches, a shunt mod isn't going to prevent that. Back in the Pascal days, hitting the power limit would sometimes cause extreme frame hitches if you were on one of the Nvidia power saving power management modes, and setting it to prefer maximum performance would remove that. I don't remember if that was a driver issue or something related to MXM cards (laptop), but that was an easy fix.


----------



## xc3_320w

Falkentyne said:


> Nvidia power saving power management modes


I have it on "normal" at the moment, because the alternative "maximum performance" makes the card idle at about 100W...

but I agree with everything you said...


----------



## Rawfodog

I think this is the highest I was able to go without tweaking the voltage curve.
+120 core, +800 mem, avg temp 61°C with the Aorus Xtreme, air-cooled.
This OC passes in Time Spy, Heaven, SotTR, and Fire Strike Ultra.



http://www.3dmark.com/pr/764875


















I scored 11 836 in Fire Strike Ultra


AMD Ryzen 7 3700X, NVIDIA GeForce RTX 3080 x 1, 16384 MB, 64-bit Windows 10




www.3dmark.com













(Edit): With case and GPU fans at 100% + an open window,
I still couldn't get higher than +120 core, but at +1200 mem I saw no loss and got this one








I scored 12 507 in Port Royal


AMD Ryzen 7 3700X, NVIDIA GeForce RTX 3080 x 1, 16384 MB, 64-bit Windows 10




www.3dmark.com


----------



## Kold

I would greatly appreciate it if someone with an OCed 3080 FE could show me what kind of Port Royal scores you're getting. Thanks 🙏


----------



## Falkentyne

What's funny is I've seen FAR more shunt modded 3090's than 3080's. I've only seen a few people mod their 3080's, and I don't think any 500W vBioses exist to even attempt cross flashing on them?


----------



## SPL Tech

Falkentyne said:


> What's funny is I've seen FAR more shunt modded 3090's than 3080's. I've only seen a few people mod their 3080's, and I don't think any 500W vBioses exist to even attempt cross flashing on them?


Well, if you shunt mod them, it doesn't matter what the BIOS power limit is; the mod overrides it entirely.


----------



## Krisztias

Kold said:


> I would greatly appreciate it if someone with an OCed 3080 FE could show me what kind of Port Royal scores you're getting. Thanks 🙏


Hi!

Power maxed out (370W), core +125, mem +1150, under water (Alphacool block):


----------



## Pedros

is that with the stock bios? because with those settings and that score, you need to be using more than the stock power envelope


----------



## Krisztias

Pedros said:


> is that with the stock bios? because with those settings and that score, you need to be using more than the stock power envelope


Yes, I only forgot to mention (I thought it was obvious) that I maxed out the power slider to 370W.
I thought FE cards can't be flashed.


----------



## Pedros

That's cool. I can get those results, but with a 430W BIOS, and my sample is not the best (Suprim X).


----------



## c0nsistent

Pedros said:


> That's cool. I can get those results, but with a 430W BIOS, and my sample is not the best (Suprim X).
> View attachment 2473780


Are you using a voltage curve or just the BIOS? I did a shunt mod and I've noticed some difference, but it seems the card just wants to pull more voltage to use the extra TDP unless you tell it otherwise.


----------



## edhutner

Hi @Pedros, which water block are you using on your Suprim X? And did you use liquid metal or regular paste?


----------



## Pedros

Lol I'm on air and stock, didn't change anything yet


----------



## Kold

If I manage to trade my XC3 Hybrid for an FE, do you guys think that's a fair trade?


----------



## Pedros

Why would you do that? Sorry, really asking because usually it's the other way around


----------



## Kold

It has a higher power target.

And, technically, it should retain its value better than the XC3 when it comes time to sell and upgrade.


----------



## outofmyheadyo

My 3080 "Xtreme" WB is such a turd; it says the power limit is 370W but it never actually uses more than 333W...
Can't really manage more in Port Royal than this: https://www.3dmark.com/pr/769663

Average clock with +100 is 2000 MHz, while the average temp was 42°C...


----------



## AveragePC

Man, my XC3 is a dog: only an 11,317 PR score and a 17,743 TS score. That's at 1880 MHz at 850 mV, +600 memory.

Anything I should try differently?


----------



## xc3_320w

yes.

try ~2010Mhz @ 875mV

you will likely top-out about 12k (just under for me).


----------



## Knoxx29

Nothing special:
Clock 1965Mhz - Voltage 0.919V - Memory stock


----------



## Peter Watson

My Ftw3 ultra 450w bios, +125 gpu and +999 mem, on air https://www.3dmark.com/pr/771558
If I go over +1100 mem my scores go down.


----------



## Pedros

damn, can you use +125 on a daily basis?


----------



## Peter Watson

Pedros said:


> damn, can you use +125 on a daily basis?


Lol, no, I wouldn't risk the crash. Gaming, I can hit over 75°C, and at +125 I think it would definitely crash. At +135 I crash in Port Royal at 56°C. I'm desperate for a water block but they're just not out yet.
The only way to get into the 13000s is a water block on these cards.


----------



## Pedros

I'm receiving my water block somewhere in the beginning of Feb. But I'm still building the loop, so... maybe during Feb I'll have it working.


----------



## Peter Watson

Pedros said:


> I'm receiving my water block somewhere in the beginning of Feb. But I'm still building the loop, so... maybe during Feb I'll have it working.


Nice, I'm waiting for the Alphacool one,


----------



## AveragePC

xc3_320w said:


> yes.
> 
> try ~2010Mhz @ 875mV
> 
> you will likely top-out about 12k (just under for me).


I'll give it a shot. Hopefully it's stable and temps are still lower than stock. I was hitting 78°C stock. At my 1880 MHz undervolt, the max I saw was 73°C.


----------



## Knoxx29

AveragePC said:


> I was hitting 78°C stock. At my 1880 MHz undervolt, the max I saw was 73°C.


Doing what?


----------



## leegoocrap

Started going down the rabbit hole and now here we are 
XC3 Ultra Gaming 3080, just installed the AIO. After tons of toying around in PX1 and Afterburner, the best I could muster is 11624 (+210core +500 memory)









I scored 11 624 in Port Royal


AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 16384 MB, 64-bit Windows 10




www.3dmark.com





Despite keeping things sub 45° (done some runs with higher fan profile at under 40) boost won't stay over 2k for any consistency. Basically sits on 1950-1980 no matter the OC.


It seems like Shunt modding is in my future :/


----------



## AveragePC

Knoxx29 said:


> Doing what?


VR sim racing, iRacing.


----------



## Knoxx29

Maybe I could be wrong, but a 1880 MHz undervolt maxing at 73°C, isn't that kinda high?


----------



## AveragePC

Knoxx29 said:


> Maybe I could be wrong, but a 1880 MHz undervolt maxing at 73°C, isn't that kinda high?


Yeah, that's what I thought. I'm using the stock fan curve; I didn't touch that. I believe that was during benchmarks, but I'll double-check tonight. Before undervolting, CP2077 was running at 75°C stock; undervolted at 1880 MHz it was 67°C.


----------



## Knoxx29

Maybe I could be wrong, but a 1880 MHz undervolt maxing at 73°C, isn't that kinda high?


AveragePC said:


> Yeah, that's what I thought. I'm using the stock fan curve; I didn't touch that. I believe that was during benchmarks, but I'll double-check tonight.


We have the same card, and mine is 1965 MHz at 0.919 V; the card runs cool even when running the Heaven benchmark. I am using the stock fan curve too.


----------



## AveragePC

Knoxx29 said:


> Maybe I could be wrong, but a 1880 MHz undervolt maxing at 73°C, isn't that kinda high?
> 
> We have the same card, and mine is 1965 MHz at 0.919 V; the card runs cool even when running the Heaven benchmark. I am using the stock fan curve too.


I’ll try targeting 1950 at 875mv or around there, and run heaven. See what’s going on temp wise.


----------



## Knoxx29

Does Airflow matter?


----------



## AveragePC

Knoxx29 said:


> Does Airflow matter?


CPU temps are great, and the case is known for having great airflow. I'm running it in the same configuration that performed best for Gamers Nexus during their testing.


----------



## Knoxx29

AveragePC said:


> CPU temps are great, and the case is known for having great airflow. I'm running it in the same configuration that performed best for Gamers Nexus during their testing.


My question was referring to the GPU, anyway i assume those temps you saw were while benching.


----------



## AveragePC

Knoxx29 said:


> My question was referring to the GPU, anyway i assume those temps you saw were while benching.


I know what you meant, but my only point of reference was that at the time of those higher GPU temps, the CPU was fine, which I would assume means the GPU should have been fine airflow-wise. My CPU is right around what everyone else sees on average. I've also read that the XC3 runs on the warm side. My EVGA GTX 1080 SC ran around 78°C stock as well.

That was during benchmarks.


----------



## Cobra652

A little stupid question maybe...
I have an EVGA 3080 FTW3 Ultra (XOC), a Ryzen 5800X, water cooling, etc. The PSU is an EVGA SuperNOVA G2 750W Gold. I want to change it to a Corsair RM1000x 1000W Gold. Is it worth it?... Thanks


----------



## mischi7

Has somebody already flashed another BIOS version on a Gigabyte 3080 Gaming OC? Does the Asus TUF BIOS work?


----------



## Peter Watson

leegoocrap said:


> Started going down the rabbit hole and now here we are
> XC3 Ultra Gaming 3080, just installed the AIO. After tons of toying around in PX1 and Afterburner, the best I could muster is 11624 (+210core +500 memory)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 11 624 in Port Royal
> 
> 
> AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 16384 MB, 64-bit Windows 10
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> Despite keeping things sub 45° (done some runs with higher fan profile at under 40) boost won't stay over 2k for any consistency. Basically sits on 1950-1980 no matter the OC.
> 
> 
> It seems like Shunt modding is in my future :/


Something not right there, must be a cooling issue, it's downclocking too much, 11k is far too low..


----------



## acoustic

Peter Watson said:


> Something not right there, must be a cooling issue, it's downclocking too much, 11k is far too low..


Mid-high 11K doesn't seem bad considering it's a 2x 8-pin card and likely not utilizing the full power limit, given the 320W issue. +210 core but still dropping under 2000 means he's banging into the power limit.

I would try doing a voltage curve, utilize 0.875 V to 0.925 V, and see what clocks you can manage at those voltages. If you have a good chip, you can very well hold above 2000 MHz and alleviate some of the issues with the power limit.

Shunt modding would be beneficial, absolutely.


----------



## krs360

Hello all,

Wondering if there's a possibility of flashing the 3080 Founders Edition card with another BIOS, or is a shunt mod the only option to increase the power limit?

Thanks


----------



## eliwankenobi

xc3_320w said:


> yes.
> 
> try ~2010Mhz @ 875mV
> 
> you will likely top-out about 12k (just under for me).


Damn! I wish I could achieve that. Gotta try. I also have a 3080 XC3 Ultra Hybrid. I can do [email protected] 850mV stable... but with the hybrid better temps I am just running a +120 core/ +500 memory and it's high on games, but I don't break 12k on PR...


----------



## eliwankenobi

leegoocrap said:


> Started going down the rabbit hole and now here we are
> XC3 Ultra Gaming 3080, just installed the AIO. After tons of toying around in PX1 and Afterburner, the best I could muster is 11624 (+210core +500 memory)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 11 624 in Port Royal
> AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 16384 MB, 64-bit Windows 10
> 
> 
> 
> 
> 
> Despite keeping things sub 45° (done some runs with higher fan profile at under 40) boost won't stay over 2k for any consistency. Basically sits on 1950-1980 no matter the OC.
> 
> 
> It seems like Shunt modding is in my future :/


This has been my experience as well. This is the highest I've achieved running a custom curve









I scored 11 798 in Port Royal
AMD Ryzen 7 3800X, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10


----------



## Knoxx29

I have a question: I have set my card to run at 1965MHz, but in games it runs all the time at 1980MHz. Any idea why this is happening?


----------



## asdkj1740

On the RTX 3000 series Sea Hawk, MSI has changed to partner with Asetek directly rather than CoolIT through Corsair.





MSI SEA HAWK 240mm GPU AIO Enables Extreme Performance - Asetek
MSI has introduced a new series of graphics cards based on the NVIDIA® Ampere GPU architecture for GeForce RTX™ 30-Series graphics cards. The SEA HAWK 240mm AIO is built to handle the power hungry 30-Series graphics cards, which can dissipate up to 350 watts under a full load.
(asetek.com)


----------



## AveragePC

Knoxx29 said:


> My question was referring to the GPU, anyway i assume those temps you saw were while benching.


Reran port royal this time at 1950mhz at 875mV +600 memory only scored 11,592, TimeSpy score of 17,980, hit 75 C.

At 1880mhz / 850mV +600 memory PR scored 11,317, TimeSpy score of 17,743, hit 73C.

Cyberpunk crashed at 1950mhz 875mV +600 memory, bumped to 881mV, see what happens. Noticed temps went up to 71C vs 70C at 875mV. In game maintains 1935mhz. 

Maybe my card just sucks lol.

At what temp does this card start to throttle?


----------



## mouacyk

mischi7 said:


> Has somebody already flashed another bios version on a gigabyte 3080 gaming oc? Does the asus TUF bios work?


Yes. I have an Eagle OC. Saw someone on TPU flashed Tuf OC onto Gaming OC so I checked ports and pins and they match. Flash works for me.

Can't wait for my block... but this card runs in the low 60s, which seems absurdly low, but ambient had dropped to 20C and it was running at 0.95v instead of the normal ~1.06v. It was hitting 72C at 22C ambient at normal voltage, so hopefully the TUF OC BIOS didn't mess with my temperature reporting.


----------



## Peter Watson

acoustic said:


> Mid-high 11K doesn't seem bad considering it's a 2x 8pin card and likely not utilizing the full power limit with the 320watt issue. +210 core but still dropping under 2000 means he's banging the power limit.
> 
> I would try doing a voltage curve and utilize .875v to .925v, and see what clocks you can manage at those voltages. If you have a good chip, you can very well hold above 2000Mhz and alleviate some of the issues with the power limit.
> 
> Shunt modding would be beneficial, absolutely.


Must be my cpu overclock, as at a reduced power limit of 320w I get over 11,900 points: https://www.3dmark.com/pr/774837

I was on the power limit all the way through the test.


----------



## eliwankenobi

Knoxx29 said:


> I have a question, i have set my card to run at 1965Mhz but in games it runs all the time at 1980Mhz, any idea why is this happening?
> 
> View attachment 2474028


Yes. The GPU BOOST 3.0 is always on. So if it sees it has the opportunity to boost a bin higher (bin is +/- 15mhz) it will do so. But no more than that. If you stress the card high enough, like with a TimeSpy Extreme load, it may go down one or two bins as well. But for the vast majority of the time, it should stay within the custom curve you set


Sent from my iPhone using Tapatalk
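For anyone wondering what a "bin" works out to numerically, here's a toy sketch of the arithmetic (my own illustration, not NVIDIA's actual boost algorithm):

```python
# Toy arithmetic behind GPU Boost "bins": clocks move in 15 MHz steps.
# This is NOT NVIDIA's real algorithm, just an illustration of why a
# 1965 MHz target can show up as 1980 MHz (one bin higher) in games.

BIN_MHZ = 15

def snap_to_bin(clock_mhz):
    """Round a clock to the nearest 15 MHz bin."""
    return round(clock_mhz / BIN_MHZ) * BIN_MHZ

def shift_bins(clock_mhz, bins):
    """Clock after boosting (positive bins) or throttling (negative)."""
    return snap_to_bin(clock_mhz) + bins * BIN_MHZ

print(shift_bins(1965, +1))  # 1980: one bin above the set clock
print(shift_bins(1965, -2))  # 1935: two bins down under a heavy load
```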


----------



## blurp

I own a RTX 3080 FTW3 with the 450W BIOS. I consider undervolting the best way to achieve the best performance per watt. Here are my Afterburner profiles using curves. All stable.
#1 1860MHz @ 875mV, Mem +800: Superposition 4K 14062, 59C, 295W, Port Royal 11452
#2 1920MHz @ 925mV, Mem +800: Superposition 4K 14462, 65C, 325W
#3 1950MHz @ 943mV, Mem +800: Superposition 4K 14707, 67C, 349W
#4 2010MHz @ 1000mV, Mem +800: Superposition 4K 15004, 72C, 398W
#5 2025MHz @ 1068mV, Mem +800: Superposition 4K 15962, 75C, 436W, Port Royal 12317

My daily profile is #1. Silence and great performance!


----------



## cstkl1

request 
anybody with 10900k or latest zen 3 benchmark horizon zero dawn.. performance, quality and ultimate @ 1080p..

saw a leak for 11900k. want to confirm.


----------



## edhutner

@cstkl1

Ryzen 5900X +30% limits, -50mv offset, curve optimizer, 32GB 3800
3080 Suprim X on air, 110% power, +100MHz core, +500MHz mem
performance: 218fps
quality: 182fps
ultimate quality: 161fps


----------



## kuwabu

Colonel_Klinck said:


> Can I ask what bios version installed after you used the ASUS RTX3080_V2.exe updater? If that is how you did update.
> 
> Mine is now 94.02.42.40.66 which I can't find on the techpowerup bios page
> 
> 
> 
> 
> 
> 
> 
> 
> TechPowerUp: extensive repository of graphics card BIOS image files, with submissions categorized by GPU vendor, type, and board partner variant.
> www.techpowerup.com


May I know what mode you are running that is showing this bios? is it performance or quiet mode?


----------



## Hresna

eliwankenobi said:


> Knoxx29 said:
> 
> 
> 
> I have a question, i have set my card to run at 1965Mhz but in games it runs all the time at 1980Mhz, any idea why is this happening?
> 
> View attachment 2474028
> 
> 
> 
> Yes. The GPU BOOST 3.0 is always on. So if it sees it has the opportunity to boost a bin higher (bin is +/- 15mhz) it will do so. But no more than that. If you stress the card high enough, like with a TimeSpy Extreme load, it may go down one or two bins as well. But for the vast majority of the time, it should stay within the custom curve you set
> Sent from my iPhone using Tapatalk
Click to expand...

The VF curve is always temperature dependant - you'll see that effect if you hit "reset" on a cold card and compare against the curve you get if you press it when the card is warm. Any time you "apply" a curve, it adjusts based on the temperature of the card. The setting you are actually modifying with a custom curve is not the actual boost frequencies, but what the OFFSET is that will be applied for each voltage setting.
Once you go live, the actual frequencies you boost will depend on where you are on the VF curve you can't see under the hood, and is affected by temperature, power limite, and other factors. But the card will always try to hit +X offset at that voltage.
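A minimal sketch of that idea, with made-up curve numbers (the real base curve lives in the driver and shifts with temperature):

```python
# Sketch of the offset idea above: a custom curve stores an OFFSET per
# voltage point, not absolute clocks. The hidden base V/F curve shifts
# with temperature, so the same offset lands on different final clocks.
# All curve values here are invented for illustration.

# hypothetical base curves: voltage (mV) -> clock (MHz)
base_curve_cool = {875: 1905, 925: 1980, 1000: 2040}
base_curve_warm = {875: 1875, 925: 1950, 1000: 2010}  # shifts down when hot

def apply_offset(base_curve, offset_mhz):
    """What the card actually targets: base clock + your offset, per point."""
    return {mv: clk + offset_mhz for mv, clk in base_curve.items()}

# the same +105 MHz offset gives different real clocks at each temperature:
print(apply_offset(base_curve_cool, 105)[875])  # 2010
print(apply_offset(base_curve_warm, 105)[875])  # 1980
```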


----------



## wilchy

Hey guys,

My best score on Port Royal with my 3080 Suprim X on air. Seems pretty good but ambient temps were pretty cold

Just short of 13k! I really wanted to hit the 13xxx mark









I scored 12 963 in Port Royal
Intel Core i9-10900K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10


----------



## edhutner

I get 11856 on port royal with
3080 suprim x, gaming bios, on air,
volt+30%,pow110%,core+100mhz,mem+500mhz, mild fan curve

Dont know what core clock it maintains. Waiting for waterblock and then will experiment with more overclocking.


----------



## itssladenlol

lordzed83 said:


> Dont use Glue Gun cause what ??
>
> there ya go 800w 3090 with glue gun method NO PROBLEM


Frame chasers is the biggest ****** out there.


----------



## Colonel_Klinck

kuwabu said:


> May I know what mode you are running that is showing this bios? is it performance or quiet mode?


Performance mode


----------



## Colonel_Klinck

Hey guys. So I finally got around to shunt modding my TUF OC. Did all the shunts, stacking R008 on top with the 842AR pen. What I'm finding a little odd is that it appears to have lower power readings: idle dropped from 100w to 85w, and my peak before was 358w but now it hits a max of around 338w. It shows the power limit flag in MSI AB very often well below 300w. In Heaven it will show PL at 297w. Is this right? Have I not attached one of the shunts properly? I've not spent long tweaking frequencies, but scores in Timespy and Port Royal are ok I think. Any comments?


Timespy 
18560
Graphics Score 19343

Port Royal 
12548


----------



## Tergon123

wilchy said:


> Hey guys,
> 
> My best score on Port Royal with my 3080 Suprim X on air. Seems pretty good but ambient temps were pretty cold
> 
> Just short of 1300 plus! I really wanted to hit the 13xx mark
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 12 963 in Port Royal
> Intel Core i9-10900K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10


That is a very nice score


----------



## Knoxx29

wilchy said:


> Hey guys,
> 
> My best score on Port Royal with my 3080 Suprim X on air. Seems pretty good but ambient temps were pretty cold
> 
> Just short of 1300 plus! I really wanted to hit the 13xx mark
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 12 963 in Port Royal
> Intel Core i9-10900K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10


Am i mistaken or you are using a Mod?


----------



## ssgwright

Knoxx29 said:


> Am i mistaken or you are using a Mod?


yeah... and he also says he's on air and averaged 46c @2190mhz on port lol yeah right...


----------



## ssgwright

here's my best which is just above him... http://www.3dmark.com/pr/759122 12,975. My average temp was 39c because I was on CHILLED WATER and he is only 5c above that on air??? I'm calling shenanigans... (unless he used the JayzTwoCents mobile AC cardboard box mod)


----------



## BluePaint

It's possible: 12847 on a Trio with the Strix BIOS, on air at 5C ambient with a 5800X (2150MHz avg, 46C avg).
https://www.3dmark.com/pr/522789
Might try for more next time when it's below 0.

I am also waiting for a waterblock; it's been weeks, and it has been pushed back another 6 weeks, which sucks.


----------



## xc3_320w

Colonel_Klinck said:


> it shows power limit in MSI AB very often well bellow 300w. In Heaven it will show PL at 297w. Is this right? Have I not attached one of the shunts properly?


your hwinfo screenshot shows what's going on - ~150w out of one of the 8-pins..

so even though you're under the max power draw for the whole card, the GPU has decided (correctly) that the draw from one of the rails is exceeding spec - so it's applying a power limit.


----------



## hubsahubsa

Is it possible to get a significantly higher power limit on a 2x8 pin card? I have an Eagle OC, tried Gaming OC BIOS but the performance gains were small for the extra 30 watts and it also broke my idle power states so i switched back to the Eagle OC BIOS. Back in the Maxwell days you could just edit the BIOS yourself however you wanted, that's not possible anymore?


----------



## xc3_320w

Not without modifying the card, or finding a BIOS that is essentially unlocked (which will likely never happen).

You need to appreciate that 375w is essentially the maximum power draw that a 2x 8-pin card is ALLOWED according to the various specs (150w per 8-pin connector plus 75w from the PCIe slot)...

The "easiest" way to overcome this limit is by shunt modding the card.

You need to decide if that is worth it vs the fact you will be voiding your warranty.

Personally, I have yet to see a compelling argument for the shunt mod (other than internet points, which may be of use to some...)... it appears to be at most a 10% performance (avg fps) increase, with a sizable power usage increase... which "today" isn't worth it for me


----------



## hubsahubsa

Yeah, I wont do hardware mods. My cables, PSU and cooling can handle more power so it feels dumb to not give more power. Well hopefully we can edit the vbios in the future


----------



## xc3_320w

hubsahubsa said:


> My cables, PSU and cooling can handle more power so it feels dumb to not give more power


Yep I know! I am kinda kicking myself that I didn't get a 3x 8-pin card myself, however it is what it is, and my card choice was dictated by form factor... NR200P, vertical GPU, with a top-mounted rad meant the FTW3 was/is out of the question.


----------



## EarlZ

I've got a choice between an MSI Suprim X 3080 that has 3x 8-pin vs an Asus TUF 3080, however the MSI is $226 USD more expensive. Is that extra power headroom worth all that cash? I am not even sure what kind of performance difference to expect between a 2x 8-pin and a 3x 8-pin once the power limit is reached.


----------



## eliwankenobi

Colonel_Klinck said:


> Hey guys. So I finally got around to shunt modding my TUF OC. Did all the shunts, stacking R008 on top with 842AR pen. What I'm finding a little odd is that it appears to have lower power readings, idle dropped from 100w to 85w and my peak before was 358w but now it hits a max of around 338w. it shows power limit in MSI AB very often well bellow 300w. In Heaven it will show PL at 297w. Is this right? Have I not attached one of the shunts properly? I've not spent long tweaking frequencies but scores in Timespy and Port Royal are ok I think. Any comments?
>
> Timespy
> 18560
> Graphics Score 19343
> 
> Port Royal
> 12548


This is normal. Remember, you are now passing current through places where the board sensors cannot sense them (ie the added resistors)


----------



## Falkentyne

Colonel_Klinck said:


> Hey guys. So I finally got around to shunt modding my TUF OC. Did all the shunts, stacking R008 on top with 842AR pen. What I'm finding a little odd is that it appears to have lower power readings, idle dropped from 100w to 85w and my peak before was 358w but now it hits a max of around 338w. it shows power limit in MSI AB very often well bellow 300w. In Heaven it will show PL at 297w. Is this right? Have I not attached one of the shunts properly? I've not spent long tweaking frequencies but scores in Timespy and Port Royal are ok I think. Any comments?
>
> Timespy
> 18560
> Graphics Score 19343
> 
> Port Royal
> 12548


There's nothing wrong with your mod. Heaven will draw up to 520W, especially at those high clocks you're pushing through that 3080. You also need to apply a multiplier to the reported power draw now that the reading is reduced. I assume it's going to be between 1.33x and 1.40x of your board power reading.
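For anyone wanting to compute their own correction factor, the arithmetic is just parallel resistance. This sketch assumes an 8 mOhm (R008) stack and treats the stock shunt value as unknown, since it varies by board:

```python
# Back-of-envelope shunt-stack math: the controller still divides the
# measured shunt voltage by the ORIGINAL resistance, so once an R008
# (8 mOhm) is stacked in parallel, reported power understates true power.
# Stock shunt values below are guesses; check your own board.

def parallel(r1_mohm, r2_mohm):
    """Resistance of two shunts stacked in parallel."""
    return r1_mohm * r2_mohm / (r1_mohm + r2_mohm)

def correction_multiplier(r_stock_mohm, r_stacked_mohm=8.0):
    """Multiply the reported power by this to estimate true power."""
    return r_stock_mohm / parallel(r_stock_mohm, r_stacked_mohm)

for r_stock in (2.0, 3.0, 5.0):
    print(f"stock {r_stock} mOhm + R008 stack -> x{correction_multiplier(r_stock):.3f}")
```

A 3 mOhm stock shunt works out to x1.375, which sits inside the 1.33-1.40 range guessed above; a 5 mOhm shunt would be x1.625.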


----------



## Colonel_Klinck

Ok thanks guys 

edit: the reason I asked is I was surprised to see the power limit flag shown in AB at under 300w, as I thought the card now thinks it's drawing less than 300w, so it isn't near the power limit. I was expecting to see lower power draw being reported, though.


----------



## Swatson

I have an ASUS EKWB 3080 on order, but I'm severely reconsidering after seeing this thread. I was hoping I could throw on the Strix OC BIOS and call it a day, but it seems like even the EVGA XOC BIOS will not actually exceed the PCIe spec power limit? Does anyone know what the TDP of the ASUS EKWB actually is? I just see 320W everywhere, not sure if actual or default.

Why did these companies make waterblock GPUs that can't ****ing OC?


----------



## hubsahubsa

Swatson said:


> I have an ASUS EKWB 3080 on order, but I'm severly reconsidering after seeing this thread. I was hoping I could throw on the Strix OC BIOS and call it a day but it seems like even the EVGA XOC BIOS will not actually exceed the pcie spec power limit? Does anyone know what the TDP of the ASUS EKWB actually is? I just see 320W everywhere, not sure if actual or default.
> 
> Why did these companies make waterblock GPUs that can't ****ing OC?


That watercooled card has a 366 watt power limit. As far as I know you can't increase it without hardware modifications, except for a massive 9 watt increase if you flash a TUF bios


----------



## leegoocrap

Swatson said:


> I have an ASUS EKWB 3080 on order, but I'm severly reconsidering after seeing this thread. I was hoping I could throw on the Strix OC BIOS and call it a day but it seems like even the EVGA XOC BIOS will not actually exceed the pcie spec power limit? Does anyone know what the TDP of the ASUS EKWB actually is? I just see 320W everywhere, not sure if actual or default.
> 
> Why did these companies make waterblock GPUs that can't ****ing OC?


Any of the 2 connector cards are going to run right up to 340w and that's the wall. Without shunting/etc that's all you're going to get with any bios that's currently available. I've flashed pretty much every bios to my xc3 ultra hybrid, and they're all 336-340 total.


----------



## hubsahubsa

leegoocrap said:


> Any of the 2 connector cards are going to run right up to 340w and that's the wall. Without shunting/etc that's all you're going to get with any bios that's currently available. I've flashed pretty much every bios to my xc3 ultra hybrid, and they're all 336-340 total.


My Eagle OC with a Gaming OC bios went up to 370 W no problem


----------



## leegoocrap

Nice, I don't doubt you, but that isn't most folks' experience from all I've read


----------



## Falkentyne

Colonel_Klinck said:


> Ok thanks guys
> 
> edit: the reason I asked is I was surprised to see power limit shown in AB at under 300w as I thought that the card now thinks its drawing less than 300w so it isn't near the power limit. I was expecting to see lower power draw being reported though.


MSI Afterburner (and GPU-Z, hwinfo etc) can flag that a power event has occurred even if you aren't actually throttling yet. Your screenshot of Heaven running showed power with the card running at full clocks. You can get a throttle limit 'warning' when you get close to the power limit. For example, you may see the 'warning' if you get to 365W (reported) with a 400W limit (these are just limits reported to the video card, based on this example, not your true power draw). You should have GPU-Z open when doing your testing. If the green throttle bar is only about 20% of the way up (meaning it isn't filling the bar to the top horizontally but is only a small blip, 'squished' downwards), it's simply an alert, not an actual throttle happening.

Also, power limit throttling doesn't start at max TDP; heavy throttling happens at max TDP. You actually begin throttling (slowly) before max TDP. For example, at 375W (out of 400W), your card may drop -15 mhz or a voltage point on the v/f graph as it tries to reduce the power draw slightly. This throttle becomes stronger as the power draw continues to climb. If it reaches the 400W TDP, you would see something as drastic as -250 mhz on the clocks (this depends on other factors) and voltage as low as 0.875v or so. It should then be obvious why you don't get full power all the way up to 400W followed by a massive throttle right after: that would cause terrible performance and stuttering.
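The shape of that behaviour can be sketched with a toy model. The thresholds and linear ramp here are invented for illustration; the real controller logic is NVIDIA's and not public:

```python
# Toy model of gradual power-limit throttling: small clock drops begin
# before the TDP and grow as draw approaches the limit, instead of one
# huge drop at max TDP. All numbers are made up for illustration.

def throttle_offset_mhz(power_w, tdp_w=400, soft_start_w=360, max_drop=250):
    """Approximate clock reduction (MHz) as power climbs toward the TDP."""
    if power_w <= soft_start_w:
        return 0  # comfortably under the limit: no throttle at all
    frac = min(1.0, (power_w - soft_start_w) / (tdp_w - soft_start_w))
    return round(frac * max_drop)  # small drops first, heavy drops near TDP

for p in (350, 375, 390, 400):
    print(f"{p}W -> -{throttle_offset_mhz(p)} MHz")
```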


----------



## Swatson

hubsahubsa said:


> That watercooled card has a 366 watt power limit. As far as i know you can't increase it without hardware modifications except for a massive 9 watt increase if you flash a TUF bios





leegoocrap said:


> Any of the 2 connector cards are going to run right up to 340w and that's the wall. Without shunting/etc that's all you're going to get with any bios that's currently available. I've flashed pretty much every bios to my xc3 ultra hybrid, and they're all 336-340 total.


Thank you for the confirmation. At this point I'm very conflicted but will probably keep the order. I haven't had a desktop since November and it's getting old using even a relatively powerful laptop 24/7. The fact that I'm buying one of these 10gb meme cards is bad enough; I'll probably flip it when the 3080 Ti or the 6900 XT Toxic comes out, which were both my original plans.


----------



## Colonel_Klinck

Falkentyne said:


> MSI Afterburner (and GPU-Z, hwinfo etc) can flag that a power event has occurred even if you aren't actually throttling yet. Your picture in your screenshot of Heaven running showed power with the card running at full clocks. You can get a throttle limit 'warning' when you get close to the power limit. Example, you may see the 'warning' if you get to 365W (reported) with a 400W limit (these are just limits reported to the video card, based on this example, not your true power draw). You should have GPU-Z open when doing your testing. If the green throttle bar is only like 20% of the way "up" the bar (meaning it wasn't filling up the bar to the top horizontally but is only a small blip, or 'squished' downwards), it means it's simply an alert, not an actual throttle happening. Also, power limit throttling doesn't happen at max TDP. Heavy throttling happens at max TDP. You actually begin throttling (slowly) before max TDP. For example, at 375W (out of 400W), your card may drop -15 mhz or a voltage point on the v/f graph, as it tries to reduce the power draw slightly. This throttle becomes stronger as the TDP continues to go up. If it reaches 400 TDP, then you would see something as drastic as -250 mhz on the clocks (this depends on other factors) and voltage would be as low as 0.875v or something. It should then be very obvious why you don't get full power all the way up to 400W and then a massive throttle right after--this would cause terrible performance and stuttering.



Thanks dude, that makes sense.


----------



## ssgwright

well... went for another run at 13k in Port Royal... got my personal best at 12,980... just can't get those dang 20 more points









I scored 12 980 in Port Royal
Intel Core i9-10850K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10


----------



## hubsahubsa

Flashing a TUF OC bios to an Eagle OC card should be fine, right? I wanna see whether that BIOS avoids breaking my idle power states like the Gaming OC one did


----------



## Peter Watson

ssgwright said:


> well... went for another run at 13k in port royale... got my personal best at 12,980... just can't get those dang 20 more points
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 12 980 in Port Royal
> Intel Core i9-10850K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10


Awesome score, I'm thinking of having a go but I don't think I can get to 13k on air.


----------



## cstkl1

Swatson said:


> I have an ASUS EKWB 3080 on order, but I'm severly reconsidering after seeing this thread. I was hoping I could throw on the Strix OC BIOS and call it a day but it seems like even the EVGA XOC BIOS will not actually exceed the pcie spec power limit? Does anyone know what the TDP of the ASUS EKWB actually is? I just see 320W everywhere, not sure if actual or default.
> 
> Why did these companies make waterblock GPUs that can't ****ing OC?


thats a ref card pcb ya


----------



## hubsahubsa

ssgwright said:


> well... went for another run at 13k in port royale... got my personal best at 12,980... just can't get those dang 20 more points
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 12 980 in Port Royal
> Intel Core i9-10850K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10


Do you know how much power it uses? That score is absolutely nuts to me. I can barely hit 12k with 370 watts


----------



## Peter Watson

hubsahubsa said:


> Do you know how much power it uses? That score is absolutely nuts to me. I can barely hit 12k with 370 watts


It's really good but only around 3 to 4fps more than you, so from 370w to 450w there's not a massive difference.


----------



## acoustic

hubsahubsa said:


> Do you know how much power it uses? That score is absolutely nuts to me. I can barely hit 12k with 370 watts


I wouldn't stress it too much. Port Royal scales extremely linearly; so the difference between 12k and 13k is just 2, 3, maybe 5fps max in the test. The difference in real-world usage is even less than that.


----------



## eliwankenobi

Peter Watson said:


> It's really good but only around 3 to 4fps more than you, so from 370w to 450w there's not a massive difference.


Really?

I need to see benchmarks of a fully tuned 3x 8-pin 3080 against a fully tuned 2x 8-pin 3080 like the XC3 Ultra for comparison.. Will post some scores soon of my XC3 Ultra Hybrid


----------



## Peter Watson

eliwankenobi said:


> Really?
> 
> I need to see benchmarks of a fully tuned 3x 8-pin 3080 against a fully tuned 2x 8-pin 3080 like the XC3 Ultra for comparison.. Will post some scores soon of my XC3 Ultra Hybrid


Yes really, https://www.3dmark.com/compare/pr/786859/pr/774837

My card at a reduced power limit of 360w versus the full 447w power limit: I couldn't hit the clocks at 360w that I can when I'm running 450w, but it's only about 4fps more.
811 points difference in Port Royal though. So roughly 200 points per 1fps
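The arithmetic behind that ratio, using only the two numbers quoted in the post:

```python
# Rough arithmetic behind "200 points per 1fps": Port Royal's score is
# close to proportional to average fps, so a points gap divides cleanly
# by the fps gap. Both numbers are taken from the comparison above.

points_gap = 811  # score difference between the 447W and 360W runs
fps_gap = 4       # approximate fps difference between the same runs

points_per_fps = points_gap / fps_gap
print(f"~{points_per_fps:.0f} points per fps")

# the same ratio in the other direction: a 1000-point gap is ~5 fps
print(f"1000 points is about {1000 / points_per_fps:.1f} fps")
```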


----------



## hubsahubsa

IMO that's a very significant boost in performance, 7%. Kinda want to do a shunt mod now


----------



## Peter Watson

hubsahubsa said:


> IMO that's a very significant boost in performance, 7%. Kinda want to do a shunt mod now


7% boost is not worth the heat and your card's warranty..


----------



## hubsahubsa

Peter Watson said:


> 7% boost is not worth the heat and your cards warranty..


Speak for yourself


----------



## eliwankenobi

Peter Watson said:


> Yes really, https://www.3dmark.com/compare/pr/786859/pr/774837
> 
> My card with reduced power limit at 360w and at full power limit 447w, I couldn't hit the clocks at 360w that I can when I'm running 450w but it's like 4fps more
> 811 points difference though in port royal. So 200 points per 1fps


What card do you have? A shunt modded XC3? Or a XC3 Ultra with the FTW XOC BIOS?


Sent from my iPhone using Tapatalk


----------



## SoldierRBT

0.987v is enough to break 13k in Port Royal. Power draw: 390W. Max temp: 52C









I scored 13 005 in Port Royal
Intel Core i9-10900K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10


----------



## ssgwright

SoldierRBT said:


> 0.987v is enough to break 13k in Port Royal. Power draw: 390W. Max temp: 52C
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 13 005 in Port Royal
> Intel Core i9-10900K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10
> 
> 
> 
> 
> 
> View attachment 2474603


how did you break 13k with an average clock of 2165? I averaged 2190 and only scored 12,980?


----------



## SoldierRBT

ssgwright said:


> how did you break 13k with an average clock of 2165? I averaged 2190 and only scored 12,980?


Higher memory oc


----------



## ssgwright

SoldierRBT said:


> Higher memory oc


wow.. my cards got a decent core... memory on the other hand... not so good


----------



## Knoxx29

#3,369 
Is it possible? (XC3 Ultra with the FTW XOC BIOS?)


----------



## Peter Watson

SoldierRBT said:


> 0.987v is enough to brake 13k in Port Royal. Power draw: 390W Max temp: 52C
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 13 005 in Port Royal
> Intel Core i9-10900K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10
> 
> 
> 
> 
> 
> View attachment 2474603


Amazing score for such low voltage. I tried with my card but I can't hit those gpu clocks; I get a crash in Port Royal after a few secs. I don't think there are many cards that will hit 2160MHz at 0.987v under load. Only way for me is a water block 😭

Also SoldierRBT have you re-pasted your 3080 card or water cooled it?


----------



## mouacyk

Bykski Water block is here. Can't wait for the +3% performance.


----------



## SoldierRBT

Peter Watson said:


> Amazing score for such low voltage, I tried with my card but I can't hit them gpu clocks I get a crash on port royal after a few secs, I don't think there are many cards that will hit 2160Mhz at 0.987v on load, Only way for me is water block 😭
> 
> Also SoldierRBT have you re-pasted your 3080 card or water cooled it?


I haven't repasted. It's running with stock cooler and good airflow on an open bench. My max score is 13,401 with peak at 2310MHz with 1.075v. Couldn't go higher because it's already hitting PL at that voltage. I was going to buy a waterblock but I purchased a 3090 KPE instead. I may sell this 3080 but haven't decided it yet.


----------



## BluePaint

@SoldierRBT 
That unicorn chip should fetch a nice price. 
Is the KPE 3090 actually faster?


----------



## SoldierRBT

@BluePaint 

I tested both cards with daily OC (Offset OC) and 3090 was around 20% faster in 3 games. (RTX Quake 2, Battlefront 2 and Battlefield V). 
3080 +150 Core (2145-2160MHz) +1200 memory (450W)
3090 +120 Core (2130-2145MHz) +1100 memory (520W)
3090 can be 23-25% faster when pulling 650-700W but can't justify 180W more just for 2-3 extra fps.


----------



## mouacyk

SoldierRBT said:


> @BluePaint
> 
> I tested both cards with daily OC (Offset OC) and 3090 was around 20% faster in 3 games. (RTX Quake 2, Battlefront 2 and Battlefield V).
> 3080 +150 Core (2145-2160MHz) +1200 memory (450W)
> 3090 +120 Core (2130-2145MHz) +1100 memory (520W)
> 3090 can be 23-25% faster when pulling 650-700W but can't justify 180W more just for 2-3 extra fps.


It's awesome to see the true ambient margins of 3090 vs 3080. All stock reviews put the advantage at 10%, so the price difference looks insane. 23-25% is easier to swallow.


----------



## SoldierRBT

mouacyk said:


> It's awesome to see the true ambient margins of 3090 vs 3080. All stock reviews put the advantage at 10%, so the price difference looks insane. 23-25% is easier to swallow.


It's because all the reviewers have 370W-400W TDP 3090s, which are only 10% faster than a 3080. When you go from 400W to 520W, there's another gap of 10% improvement. At 3440x1440 I see 18-20% with the 520W BIOS; it may be higher at 4K. The 3090 is a good buy if you wanna game at 4K and don't mind the high price. Perf/price, the 3080 is king.


----------



## Warrimonk

I feel like either my 3800X or my cheap RAM (3800 CL19) has been what is holding back my PR scores.. I had 11.6k with my XC3 3080 (2.0ghz core , +600 memory) and now with a Suprim X 3080 ( 2.1Ghz core, +1250 memory, with higher power limit and better cooler), I'm still hitting 11.6-11.7K


----------



## Knoxx29

Would anyone be that kind to answer my previous post? 
Merci


----------



## BluePaint

2x 8-pin cards are not compatible with BIOSes from 3x 8-pin cards.
It has been mentioned a lot in this thread.


----------



## Knoxx29

BluePaint said:


> 2x8pin cards are not compatible with 3x8pin cards BIOS
> has been mentioned a lot in this thread


I missed a few posts, i asked because someone mentioned it


----------



## mouacyk

Warrimonk said:


> I feel like either my 3800X or my cheap RAM (3800 CL19) has been what is holding back my PR scores.. I had 11.6k with my XC3 3080 (2.0ghz core , +600 memory) and now with a Suprim X 3080 ( 2.1Ghz core, +1250 memory, with higher power limit and better cooler), I'm still hitting 11.6-11.7K


Monitor your CPU and GPU usage with MSI Afterburner OSD. My guess is that a RAM bottleneck will turn into a CPU bottleneck and you should see CPU usage at 99% and GPU less than that.


----------



## zlatanselvic

Any bios recommendations for the suprim X?


----------



## cennis

SoldierRBT said:


> I haven't repasted. It's running with stock cooler and good airflow on an open bench. My max score is 13,401 with peak at 2310MHz with 1.075v. Couldn't go higher because it's already hitting PL at that voltage. I was going to buy a waterblock but I purchased a 3090 KPE instead. I may sell this 3080 but haven't decided it yet.


I assume this is an EVGA FTW3?


----------



## SoldierRBT

@cennis 
Yes


----------



## Peter Watson

Warrimonk said:


> I feel like either my 3800X or my cheap RAM (3800 CL19) has been holding back my PR scores. I had 11.6k with my XC3 3080 (2.0GHz core, +600 memory) and now with a Suprim X 3080 (2.1GHz core, +1250 memory, with a higher power limit and better cooler), I'm still hitting 11.6-11.7k.


Are you not losing FPS with the mem that high? If I run anything over +1050 I start to lose serious FPS.


----------



## obscurehifi

Warrimonk said:


> I feel like either my 3800X or my cheap RAM (3800 CL19) has been holding back my PR scores. I had 11.6k with my XC3 3080 (2.0GHz core, +600 memory) and now with a Suprim X 3080 (2.1GHz core, +1250 memory, with a higher power limit and better cooler), I'm still hitting 11.6-11.7k.


One thing I noticed is that my Aorus Xtreme Waterforce actually runs benchmarks decently all the way to +1500 (22,000), but then doesn't play nicely with the core clock, because both pull from my limited 375W power cap. I tested this by reducing the memory clock to (-500) 18,000 and, sure enough, the core clock boosted higher than I had been seeing. My card, when heavily loaded, stays around 2,000 to 2,050 on the core clock; with memory at 18,000, the core clock went well above 2,100. It seems to be a balance: in games and benchmarks that like memory clock, I do better around +1200 to +1500, while workloads that favor core clock need the memory clock reduced (to get a higher core clock), because both draw from the same limited power budget.

Sent from my SM-G973U using Tapatalk


----------



## obscurehifi

Also, I'm unsure why, but MSI Afterburner uses half the memory clock offset that the AORUS Engine tool uses. Kind of silly: +1500 in AB is equivalent to +3000 in AE. My default memory clock is 19,000, and both of the settings mentioned above yield the same 22,000 speed. So technically, AB works in half the effective number. Not sure why that is!

Sent from my SM-G973U using Tapatalk
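The two offsets line up if Afterburner counts in the half-rate clock domain while AORUS Engine counts effective MT/s directly. A quick sketch under that assumption (the tool names and the 19,000 base come from the post above; the conversion rule itself is inferred, not documented):

```python
# Sketch of the offset conversion described above: Afterburner's memory
# offset appears to be in the half-rate clock domain (each +1 is +2 MT/s
# effective), while AORUS Engine's offset is in effective MT/s directly.
BASE_EFFECTIVE = 19000  # GDDR6X effective rate in MT/s ("19 Gbps")

def effective_from_ab(ab_offset):
    # Afterburner: offset counted at half rate, so double it.
    return BASE_EFFECTIVE + 2 * ab_offset

def effective_from_ae(ae_offset):
    # AORUS Engine: offset counted in effective MT/s.
    return BASE_EFFECTIVE + ae_offset

print(effective_from_ab(1500))  # 22000
print(effective_from_ae(3000))  # 22000
```

Both settings land on the same 22,000 effective rate, matching what the post observed.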


----------



## Warrimonk

Peter Watson said:


> Are you not losing FPS with the mem that high? If I run anything over +1050 I start to lose serious FPS.


Figured it out. The problem was with my actual DRAM. I reset it from 3600 CL19 to 3200 CL16 and my score shot up to 12060


----------



## SPL Tech

Warrimonk said:


> Figured it out. The problem was with my actual DRAM. I reset it from 3600 CL19 to 3200 CL16 and my score shot up to 12060


Sounds like a fake benchmark then. Games don't see FPS advantages with faster RAM. Everyone knows this. The difference between 2133 MHz and 5200 MHz RAM is like 1 FPS in gaming. RAM speed plays nearly zero role in gaming.


----------



## SoldierRBT

SPL Tech said:


> Sounds like a fake benchmark then. Games don't see FPS advantages with faster RAM. Everyone knows this. The difference between 2133 MHz and 5200 MHz RAM is like 1 FPS in gaming. RAM speed plays nearly zero role in gaming.


This is a joke, right?


----------



## Tergon123

SoldierRBT said:


> This is a joke, right?


WOW


----------



## Falkentyne

SPL Tech said:


> Sounds like a fake benchmark then. Games don't see FPS advantages with faster RAM. Everyone knows this. The difference between 2133 MHz and 5200 MHz RAM is like 1 FPS in gaming. RAM speed plays nearly zero role in gaming.


This isn't true.
Fortnite has massive gains going from 3200 CL18 to 4400 CL16.


----------



## Falkentyne

BTW, proof of testing done:

PCBuilding — KingFaris10's site (kingfaris.co.uk)


----------



## SPL Tech

Falkentyne said:


> This isn't true.
> Fortnite has massive gains going from 3200 CL18 to 4400 CL16.
> 
> View attachment 2474762


"Don't waste money chasing RAM speed for gaming on AMD or Intel" — revisiting the question that every gamer considers when building a new gaming system: does RAM speed matter? (www.pcgamer.com)

It didn't matter in any of these benchmarks. Also, there is a difference between running 400 FPS at 1080p on low settings vs. running 4K at max settings. On 4K systems it matters even less.


----------



## Falkentyne

SPL Tech said:


> "Don't waste money chasing RAM speed for gaming on AMD or Intel" — does RAM speed matter? (www.pcgamer.com)
> 
> It didn't matter in any of these benchmarks. Also, there is a difference between running 400 FPS at 1080p on low settings vs. running 4K at max settings. On 4K systems it matters even less.


You do know who you're talking to, right?...
Don't preach to the choir. Sending me videos from beginners is pretty useless.

Those videos are garbage. None of those people know how to properly tune RAM; they're techtubers. The closest person who knows about tuning is Buildzoid. Even Steve @ GamersNexus admitted that he has no time to spend tuning RAM because it's extremely tedious, often a multi-day thing, far more time-consuming than tuning a CPU. We tune RAM in the overclocking Discord.


----------



## xc3_320w

/must resist replying with anything non-serious!!!


----------



## eliwankenobi

I will add that RAM speed and timings do matter... a lot! 


Sent from my iPhone using Tapatalk


----------



## acoustic

I saw a jump in performance just tweaking my 3200 CL15 to 3800 CL15 with some small tweaks to tertiary timings. This is at 3840x1600 with a 3080. AC: Valhalla saw an almost 5 FPS average increase with a 9900K @ 5GHz.


----------



## mouacyk

While these scores are good for non-shunted PL and voltage, I can't help but think there's still something wrong with my Bykski Eagle OC waterblock, maxing at 54C. My 1080 Ti at 2100MHz and 1.1v maxed at 38C. I know I didn't use any washers, so maybe the tension is off.

I've never had a waterblock perform this badly, so I'm positive I've made a mistake somewhere. Even my 980 Ti (EK block) ran under 40C with 240+120 rads at 1493MHz and 1.2v. I see Reddit and Amazon reviews of the same or similar Bykski blocks where people report up to 50C temps, but I refuse to accept that as normal after living with <40C for two generations at 20C ambient.


----------



## Knoxx29

mouacyk said:


> While these scores are good for non-shunted PL and voltage, I can't help but think there's still something wrong with my Bykski Eagle OC waterblock, maxing at 54C. My 1080 Ti at 2100MHz and 1.1v maxed at 38C. I know I didn't use any washers, so maybe the tension is off.


I swear I saw those screenshots somewhere else.


----------



## mouacyk

Knoxx29 said:


> i swear that i saw those Screenshots somewhere else


They are your worst nightmare...


----------



## Peter Watson

Te1


----------



## obscurehifi

mouacyk said:


> While these scores are good for non-shunted PL and voltage, I can't help but think there's still something wrong with my Bykski Eagle OC waterblock, maxing at 54C. My 1080 Ti at 2100MHz and 1.1v maxed at 38C. I know I didn't use any washers, so maybe the tension is off.
> 
> I've never had a waterblock perform this badly, so I'm positive I've made a mistake somewhere. Even my 980 Ti (EK block) ran under 40C with 240+120 rads at 1493MHz and 1.2v. I see Reddit and Amazon reviews of the same or similar Bykski blocks where people report up to 50C temps, but I refuse to accept that as normal after living with <40C for two generations at 20C ambient.




Actually, those temps seem sort of okay. Over an hour-long gaming session, mine hits a constant 59C with fans at 54% — Aorus 3080 Waterforce with the stock 375W BIOS. In my 3DMark benchmarks, though, my average temps are usually 39 to the low 40s with the fans at 100%.

Are the temps you're seeing after a 2 minute bench? If so, do you have the fans maxed? 

Sent from my SM-G973U using Tapatalk


----------



## mouacyk

@obscurehifi Yes, 54C is after 2-minute bench runs. I've tried CP2077 for 15+ minutes and it easily reaches 57C, which is not great compared to the stock Eagle OC cooler, which on my sample reached only the low 60s. I haven't done a DDU and driver re-install after swapping out my 1080 Ti either, so who knows... will have to try the following first:

1) DDU and driver re-install
2) Re-inspect the die for proper LM contact and apply washers on re-assembly
3) Turn up D5 pump to full speed, it's at 3700RPM (of 4700RPM)
4) Sulk in defeat


----------



## chiknnwatrmln

Finally got my hands on a 3080!!

Should be quite an upgrade from my 1080, and temporary 3070. Just need to get rid of my water cooling to fit it...


----------



## obscurehifi

mouacyk said:


> @obscurehifi Yes, 54C is after 2-minute bench runs. I've tried CP2077 for 15+ minutes and it easily reaches 57C, which is not great compared to the stock Eagle OC cooler, which on my sample reached only the low 60s. I haven't done a DDU and driver re-install after swapping out my 1080 Ti either, so who knows... will have to try the following first:
> 
> 1) DDU and driver re-install
> 2) Re-inspect the die for proper LM contact and apply washers on re-assembly
> 3) Turn up D5 pump to full speed, it's at 3700RPM (of 4700RPM)
> 4) Sulk in defeat


What temperature do you reach before your 2-minute bench runs? I usually wait a couple minutes between runs and my temp drops into the 20s, I'm pretty sure. Again, that's with my fans maxed, but that's just so I can maintain a higher clock for benchmark runs, and it cools down quickly between runs.

Sent from my SM-G973U using Tapatalk


----------



## cennis

hubsahubsa said:


> My Eagle OC with a Gaming OC bios went up to 370 W no problem


Which version? My GB Gaming OC card on the stock BIOS only pulls <350W.


----------



## SPL Tech

I am thinking all of the cards are limited to 330W regardless of what people claim. I haven't seen a single GPU-Z screenshot of someone pulling more than 330W on their card.


----------



## blurp

False claim. I reach >400W easily on my 3080 FTW with the 450W BIOS. I've since undervolted it to top out at 300W, 875mV, 60C, 1860MHz.


----------



## DaftConspiracy

SPL Tech said:


> I am thinking all of the cards are limited to 330W regardless of what people claim. I haven't seen a single GPU-Z screenshot of someone pulling more than 330W on their card.


I blame Nvidia for that. I honestly think it's a driver-level restriction to try and push their stupid 12-pin connector. Not sure if anyone else has noticed that the FE cards (which are still powered by 2x8-pin even if they use a stupid adapter) have no issue exceeding 450W. It's the only explanation I can think of for every 2x8-pin card being limited to 350W and every 3x8-pin card to 450W.


----------



## DaftConspiracy

Anyone figure out how to lock in voltage on these? The highest I can lock in is 1.081v, and that's only by altering the stock curve, which completely butchers performance as soon as the card hits the power limit. If I set a proper curve offset it'll only maintain 1.06v. I can either lock in 2130MHz at 1.081v and suffer when I hit the power limit, or offset the stock curve, which limits me to 2085MHz but won't tank as hard at the power limit.


----------



## obscurehifi

DaftConspiracy said:


> Anyone figure out how to lock in voltage on these? The highest I can lock in is 1.081v, and that's only by altering the stock curve which completely butchers performance as soon as the card hits power limit. If I set a proper curve offset it'll only maintain 1.06v. I can either lock in 2130mhz at 1.081v and suffer if I hit power limit, or offset the stock curve which limits me to 2085mhz but won't tank as hard at power limit.


A lot of people on here achieve their overclocks by undervolting rather than trying to lock in high voltages. My card gets its best sustained clock between 950 and 970mV for benchmarks (370W limit). Power rises with both frequency and voltage, but at different rates (roughly frequency times voltage squared), so decreasing voltage allows a higher frequency when you're capped by a power limit.

Sent from my SM-G973U using Tapatalk
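The trade-off can be roughed out with the standard dynamic-power approximation P ≈ C·f·V². This is a simplification (it ignores leakage and the card's actual V/F curve), and the 1.075V/0.950V figures below are just illustrative values from this thread, not measurements:

```python
# Back-of-envelope for the undervolting trade-off: dynamic power scales
# roughly as P ~ C * f * V^2, so at a fixed power cap the frequency
# budget grows with the square of the voltage reduction.
def freq_headroom(v_old, v_new):
    """Relative frequency allowed at the same power when voltage drops."""
    return (v_old / v_new) ** 2

gain = freq_headroom(1.075, 0.950)
print(f"~{gain:.2f}x frequency budget at the same power limit")  # ~1.28x
```

In practice the card's V/F curve limits how much of that headroom is usable, but it shows why an undervolt can sustain higher clocks under a hard power cap.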


----------



## obscurehifi

SPL Tech said:


> I am thinking all of the cards are limited to 330W regardless of what people claim. I haven't seen a single GPU-Z screenshot of someone pulling more than 330W on their card.


This isn't accurate. My 370W card pulls sustained 360W readings all day in GPU-Z, and I've seen many screenshots in this thread of other people's readings higher than 330. Perhaps you're referring to people that shunt-mod their card to fool it into thinking it's pulling less power, in which case you have to calculate the actual power; but even then they are probably using 370W BIOSes, and GPU-Z would show 370 instead of 450+.

Sent from my SM-G973U using Tapatalk


----------



## BluemoonRisen

So as a ZOTAC 3080 Trinity owner, I am really excited about the Zotac 3080 AMP Extreme Holo BIOS, because that card has 2x8-pin connectors as well.

I don't have much info, but I hope this BIOS can finally unlock the power limit from 330W to 375W — or do you think the Trinity is PCB-locked at 330W?


----------



## DaftConspiracy

obscurehifi said:


> This isn't accurate. My 370W card pulls sustained 360W readings all day in GPUZ. I've seen many screen shots of other people's readings on this thread higher than 330. Perhaps you're referring to people that shunt mod their card to fool the card into thinking it's pulling less power, in which case you have to calculate what the actual power is, but even then they are probably using 370W bioses and GPUZ would be showing 370 instead of 450+.
> 
> Sent from my SM-G973U using Tapatalk


My card was a 375W card but it would only ever read a peak of 360W in GPU-Z, and it was so brief that it only registered as a peak of 350 in HWiNFO. If it's only consuming that much for less than the polling interval of HWiNFO, I'd hardly call it a raised limit. I've shunt-modded mine now so it'll draw up to about 450W sustained, so locking it at 2130MHz at 1.081v does work most of the time. I just wish there was a way to force it to boost up to that without lowering the rest of the curve.
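The polling argument is easy to demonstrate: a spike shorter than the sampling interval can fall entirely between samples and never show up in a logged maximum. A toy illustration (the 330W/360W numbers and timings are made up for the example, not taken from any tool's actual behavior):

```python
# Toy demo: a brief power spike that fits between samples is invisible
# to a slow poller's max reading.
def sampled_max(trace_ms, period_ms):
    """Max of a per-millisecond trace, sampled every period_ms milliseconds."""
    return max(trace_ms[::period_ms])

trace = [330] * 1000           # 1 s of power readings, one per ms
trace[505:525] = [360] * 20    # a 20 ms spike to 360 W

print(sampled_max(trace, 1))    # 360 -- fast polling catches the spike
print(sampled_max(trace, 100))  # 330 -- samples at 0, 100, ..., 900 miss it
```

This is why two monitoring tools with different polling rates can report different "peaks" for the same run.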


----------



## mouacyk

Looks like adding another drop of LM and using washers did improve the die contact on my Bykski block: 47C vs 54C now in a quick benchmark. Still 7C higher than the 1080 Ti, but that's likely due to the higher power and denser die of Ampere.


----------



## DaftConspiracy

mouacyk said:


> Looks like adding another drop of LM and using washers did increase the die contact on my Bykski block. 47C vs 54C now in quick benchmark. Still 7C higher than 1080TI, but that's likely due to the higher power and denser die of Ampere.
> 
> View attachment 2474972


The standoffs on the block are likely a hair too tall, preventing the block from making good contact with the die. Had that issue with the Hydro Copper block on my 2080s until I ground them down a touch. (Assuming you're not limited by other variables such as radiators)


----------



## mouacyk

DaftConspiracy said:


> The standoffs on the block are likely a hair too tall, preventing the block from making good contact with the die. Had that issue with the Hydro Copper block on my 2080s until I ground them down a touch. (Assuming you're not limited by other variables such as radiators)


Thanks for the additional things to look into. I have a Swiftech 120mm + GTX 360mm, so I might be reaching capacity at 360W (GPU) + ~150W (CPU). Will have to review my photos to see if the standoffs might be an issue.


----------



## xc3_320w

SPL Tech said:


> I am thinking all of the *2x8-pin reference design* cards are limited to 330W regardless of what people claim.


Fixed.


----------



## hubsahubsa

I just found out that my non-stock BIOS actually wasn't the reason my 3080 idled at 100 watts. The reason was the newest drivers. Installed 460.89 and now it idles normally at 20 watts.


----------



## mouacyk

hubsahubsa said:


> I just found out that me having a non-stock BIOS on my 3080 actually wasn't the reason why it idled at 100 watts. The reason was newest drivers. Installed 460.89 and now it idles normally at 20 watts.


I'm using the latest driver with the Gaming OC BIOS on my Eagle OC card and it idles around 20W. You probably set the power management mode to "Prefer maximum performance" in the NVIDIA Control Panel and forgot about it. That always keeps idle clocks at 1800MHz.


----------



## hubsahubsa

mouacyk said:


> I'm using the latest driver and have Gaming OC BIOS on my Eagle OC card and it idles around 20W. You probably set "Performance Mode" in NVidia Control panel and forgot about it. That always keeps idle clocks at 1800MHz.


No, that setting doesn't do anything I can notice.



SPL Tech said:


> I am thinking all of the cards are limited to 330W regardless of what people claim. I haven't seen a single GPU-Z screenshot of someone pulling more than 330W on their card.


3080 Eagle OC with the Gaming OC BIOS in Witcher 3. The highest peak I've seen in GPU-Z has actually been over 370 watts even though the BIOS limit is 370W — I think it was 374W in Fallout 4.


----------



## Colonel_Klinck

Struggling to get past 12824 in Port Royal with my TUF OC.

I scored 12 824 in Port Royal — Intel Core i9-10900K, NVIDIA GeForce RTX 3080, 32768 MB, 64-bit Windows 10 (www.3dmark.com)


----------



## obscurehifi

mouacyk said:


> I'm using the latest driver and have Gaming OC BIOS on my Eagle OC card and it idles around 20W. You probably set "Performance Mode" in NVidia Control panel and forgot about it. That always keeps idle clocks at 1800MHz.


I believe all this setting does is allow the maximum amount of power through the PCIe slot, which I believe is 70 or 75W. I've had my card idle at a high frequency before after playing with overclocking, and just had to reboot for it to idle down. My hunch is something gets stuck software/driver-wise that the reboot fixes.

Sent from my SM-G973U using Tapatalk


----------



## obscurehifi

Colonel_Klinck said:


> Struggling to get past 12824 in Port Royal with my TUF OC.
> 
> I scored 12 824 in Port Royal — Intel Core i9-10900K, NVIDIA GeForce RTX 3080, 32768 MB, 64-bit Windows 10 (www.3dmark.com)


You're lucky then; I can't seem to get past 12330. I have 32GB of fast system memory on the way that I'm hoping will help. Fingers crossed.

Sent from my SM-G973U using Tapatalk


----------



## Peter Watson

Colonel_Klinck said:


> Struggling to get past 12824 in Port Royal with my TUF OC.
> 
> I scored 12 824 in Port Royal — Intel Core i9-10900K, NVIDIA GeForce RTX 3080, 32768 MB, 64-bit Windows 10 (www.3dmark.com)


I'm in the same boat with my ftw3.. https://www.3dmark.com/pr/795158


----------



## Colonel_Klinck

Peter Watson said:


> I'm in the same boat with my ftw3.. https://www.3dmark.com/pr/795158


Wow, you've got a fantastic 9900K there. I could never get mine past 5.2, and it was terrible at memory overclocking — couldn't get my G.Skill 3600 CL16 past 3800 CL16. This 10900K with the same RAM is 4200 CL16 and rock-solid stable. I could probably push it further in frequency and timings.


----------



## DaftConspiracy

Anyone try flashing a 3090 bios to a 3080? Or had success actually increasing power limit by flashing a 3 connector bios to a 2 connector card?


----------



## mouacyk

@Colonel_Klinck and @Peter Watson You guys shunted for those PR runs? I'm at my PL of 360W and likely cannot progress beyond 12.2K. Haven't tried above 0.95v yet, though.



DaftConspiracy said:


> Anyone try flashing a 3090 bios to a 3080? Or had success actually increasing power limit by flashing a 3 connector bios to a 2 connector card?


Best for someone with dual BIOS to try. Guessing the 3x8-pin to 2x8-pin flash will work, but with only ~66% of the pin power available, the possible 450W total drops to about 297W, so it's useless. A 3090-to-3080 flash will likely be blocked by NVFlash unless someone mods it.
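The ~297W figure follows from the per-connector budget. A sketch assuming the 3x8-pin BIOS splits its pin budget evenly per connector — the even split and the 450W total are assumptions for illustration; real BIOSes set per-rail limits individually:

```python
# Rough sketch of the power-budget argument: flash a 450 W 3x8-pin BIOS
# onto a 2x8-pin card and only two of the three rail budgets are usable.
BIOS_TOTAL_W = 450
RAILS_ON_BIOS = 3
RAILS_ON_CARD = 2

per_rail = BIOS_TOTAL_W / RAILS_ON_BIOS  # 150 W per rail (assumed even split)
usable = per_rail * RAILS_ON_CARD        # only two rails are populated
print(f"{usable:.0f} W of {BIOS_TOTAL_W} W usable")  # 300 W, i.e. ~66%
```

Two thirds of 450W is 300W; the 297W quoted above is the same 66% figure rounded down, which is why the flash gains nothing over a stock 2x8-pin limit.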


----------



## man from atlantis

Good news for Palit GameRock owners looking for liquid cooling options: Alphacool is preparing to launch a block for the 3080/3090 GameRock series.

'Gainward Phantom, Phantom GS, Palit GameRock, GameRock OC RTX 3090, 3080 waterblocks' (forum.alphacool.com)


----------



## Colonel_Klinck

mouacyk said:


> @Colonel_Klinck and @Peter Watson You guys shunted for those PR runs? I'm at my PL of 360W likely cannot progress beyond 12.2K. Haven't tried above 0.95v yet though.



Yeah, I have an R008 stacked on the stock shunt. Tempted to try the R005 but don't want to overload the PCIe slot/24-pin.
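For reference, the parallel combination of the stacked shunt sets how far the controller under-reads. A sketch assuming a 5 mΩ (R005-class) stock shunt — that value is an assumption for illustration, not a verified figure for this card:

```python
# Shunt-stack math: soldering an R008 (8 mOhm) on top of an assumed
# 5 mOhm stock shunt lowers the effective resistance, so the controller
# sees a smaller voltage drop and under-reports power.
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

R_STOCK = 0.005  # assumed stock shunt value -- check your own card
R_ADDED = 0.008  # R008 stacked on top

scale = parallel(R_STOCK, R_ADDED) / R_STOCK   # reported/actual = 8/13
print(f"reported power is ~{scale:.0%} of actual")         # ~62%
print(f"a reported 320 W is really ~{320 / scale:.0f} W")  # ~520 W
```

Stacking a lower-value shunt (like an R005) shifts the ratio further, which is why the real draw on the slot/24-pin becomes the concern.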


----------



## Peter Watson

man from atlantis said:


> Good news for Palit GameRock owners looking for liquid cooling options: Alphacool is preparing to launch a block for the 3080/3090 GameRock series.
> 
> 'Gainward Phantom, Phantom GS, Palit GameRock, GameRock OC RTX 3090, 3080 waterblocks' (forum.alphacool.com)


I was looking at 3090 GameRocks and saw they were in stock in the UK, but with no waterblocks, and it's not a sure thing Alphacool will make them. If they confirm waterblocks I will defo get a 3090 GameRock.


----------



## Peter Watson

Colonel_Klinck said:


> Wow you've got a fantastic 9900k there. I could never get mine past 5.2 and it was terrible at memory overclocking. Couldn't get my g-skill 3600 CL16 past 3800 CL16. This 10900k with the same ram is 4200 CL16 and rock solid stable. I could probably push it further in frequency and timings.


It's nothing special tbh — I have very good water cooling, and I delidded and lapped it. I only take it to 5.5 and 5.6 for benchmarks; it takes 1.55v and LLC Turbo to do 5.6GHz lol, so it doesn't stay at those clocks for long. I just run it at 5.1GHz for normal use.


----------



## Peter Watson

Colonel_Klinck said:


> Yeah I have R008 stacked on the stock. Tempted to try the R005 but don't want to overload the PCI/24pin.


No shunt for me, but my CPU has a fairly high overclock. I've found that getting my single-core performance close to the i9-10900K helps a lot in Port Royal.


----------



## KBDE

The 3080 FE BIOS might also be interesting for 2x8-pin card owners, since it has decent power limits and it's essentially a 2x8-pin card (even though it has that 12-pin connector). But yeah, we need a patched NVFlash for that.


----------



## Colonel_Klinck

Peter Watson said:


> No shunt for me, but my cpu has a fairly high overclock, for me I've found getting my single core performance close to the i9 10900k helps alot in port royal.


Are you running just single core at 5.5 for Port Royal?


----------



## Peter Watson

Colonel_Klinck said:


> Are you running just single core at 5.5 for Port Royal?


5.5 all-core; I don't know how to do just single core lol..


----------



## eliwankenobi

KBDE said:


> The 3080 FE bios might also be interesting for 2x8pin card owners since it has decent power limits and it's essentially a 2x8pin card (even though it has that 12pin connector). But yeah, we need a patched nvflash for that.
> 
> View attachment 2475057


Amazing if it's possible! I've heard trying to NVFlash an FE BIOS onto even a reference-board card will give the A/non-A error.


Sent from my iPhone using Tapatalk


----------



## obscurehifi

hubsahubsa said:


> I just found out that me having a non-stock BIOS on my 3080 actually wasn't the reason why it idled at 100 watts. The reason was newest drivers. Installed 460.89 and now it idles normally at 20 watts.





mouacyk said:


> I'm using the latest driver and have Gaming OC BIOS on my Eagle OC card and it idles around 20W. You probably set "Performance Mode" in NVidia Control panel and forgot about it. That always keeps idle clocks at 1800MHz.


I just realized I had the same issue! I had it in the past but thought it had been fixed.

I just saw that I was idling at ~1800MHz on NVIDIA driver 461.09. I tried several things, but the only thing that let it idle down was setting the NVIDIA "Power Management Mode" back to normal AND rebooting the computer. Hitting apply without rebooting doesn't change it; you have to reboot. I went from ~1800MHz and 100W to 210MHz and 25-30W. Not sure if this setting actually affects performance under load, but it DEFINITELY affects idle. My PCIe slot power is now reported as around 7W at idle, so if I was idling at 100W before, more than just slot power was involved — it must have been pulling extra from the 8-pin connectors to reach 100W.


----------



## obscurehifi

SPL Tech said:


> I am thinking all of the cards are limited to 330W regardless of what people claim. I haven't seen a single GPU-Z screenshot of someone pulling more than 330W on their card.


Alright, I had some time to do some runs; this is the result of a Port Royal run. Here's the GPU-Z with the max board power draw at 367.8W. You can see some peaks there, so the next shot shows the cursor over one of the lower sections, showing 356W. I have this set to a 500ms polling frequency.
https://www.3dmark.com/3dm/57079115
This is about 100 points off my highest PR run. I had the Aorus Engine set to +150 (1,995MHz) on core clock and the memory clock set to 19,802MHz.

Screenshot when the cursor was over one of the lower values shown.

During the same run, Open Hardware Monitor, which I think polls every three seconds, shows this:

Here's a shot of the BIOS, which shows it as a 370W BIOS:
I conversed with a Gigabyte engineer who said this information actually comes from the Nvidia driver and is not completely accurate. They have current sensors on the actual wires/traces and say that the card actually pulls 370W minus one step, but they didn't say what that step was. I'm guessing it's 5 watts or less.

Edit: When I say "conversed with a Gigabyte engineer," it was actually through a support agent who relayed the engineers' responses to me. I was asking why my card wasn't drawing 370W. The engineer said to overclock the card to make sure it's pulling all the power. Also, they said overclocking Gigabyte cards does not void the warranty.


----------



## ducegt

I haven't had much time to enjoy my 3080 Gaming X Trio flashed with the Suprim BIOS (+150/+500), but I've had an issue with my system unexpectedly powering off. Well, the power LED remains lit, but one press of the button completely powers it off. Event 41 Kernel Power in the logs. Strangely enough, this never happened in any games or while benching, but it occurs when just web browsing.

My RM850x is about 5 years old. I had been using two 6+2 pins on the 3-connector card, but just changed it to dedicated cables for each connector. Neither rail drops below 11.9v under load. I never had this issue with my Vega 64 LC pulling close to 400W, so I'm fairly confident it's the 3080.


----------



## SPL Tech

obscurehifi said:


> Also, they said overclocking the Gigabyte cards do not void the warranties.


Well, that is a meaningless statement, as no one can tell whether you overclocked a card or not. So it won't void the warranty because they have no way of knowing either way.


----------



## Imprezzion

Well, I might be in luck. I check stock in local webshops regularly and, lo and behold, a shop had the MSI Suprim X 3080 in stock for a relatively normal price as well... went ahead and ordered one lol. Let's hope I was on time this time and that I actually get it!

Gotta ask tho: is the Suprim X's cooler good enough to sustain proper overclocks without too much temperature "throttling", or should I slap my Kraken G12 + X52 on it just like I have on my 2080 Ti now? Is that worth it?

And how's the BIOS situation on the Suprim X? Is there any BIOS that will give me more power limit or something useful, or is that not available (yet)?


----------



## edhutner

The Suprim X has two BIOSes; the "Gaming" BIOS-switch position is the one with the high power limit. I use mine at 110% power (it may go up to 116%), +30% voltage, +100 clock, +500 mem. In gaming it peaks at about 370-400W; temperature is about 70-72C with fans around 80%.

I don't think you'll need to flash another BIOS, because with the stock Gaming BIOS the max limit is 430W.


----------



## Imprezzion

Oh yeah, 430W should be plenty. That does sound quite hot tho. I'm used to MSI's coolers being quite capable, but by the sound of those temps I might just put the Kraken X52 + G12 on it. Oh well, we'll see if I get a tracking code later today and if it actually arrives, or if I missed it again and have to wait ages for it hehe.


----------



## fraefm95

Hey guys,

I'm using an RTX 3080 FTW3 Ultra with a Bykski waterblock.

Put it on LM with an EK CE 560mm radiator and 8 fans in push/pull.

Able to overclock +120/+1200.

Below is my Port Royal score.

Temps never exceed 40°C.


----------



## fraefm95

BUT my 3080 never exceeds 420W.

Don't know why it doesn't reach 450W.


----------



## DaftConspiracy

KBDE said:


> The 3080 FE bios might also be interesting for 2x8pin card owners since it has decent power limits and it's essentially a 2x8pin card (even though it has that 12pin connector). But yeah, we need a patched nvflash for that.
> 
> View attachment 2475057


Oh, so those 3 resistors _do_ have to be shunted as well! Damn it, guess I have to order more resistors. Well, good to know what's holding me back anyway.


----------



## DaftConspiracy

Also discovered that older BIOSes permit up to 1.1v sustained under load, while the newer ones I've tried only allow up to 1.075v sustained. 1.1v let me lock in 2145MHz while gaming; before, I was only able to lock in 2100MHz with occasional bursts to 2115MHz.


----------



## acoustic

fraefm95 said:


> Hey guys,
> 
> I'm using a rtx 3080 ftw3 ultra with byksbi waterblock.
> 
> Put it on LM and EK CE 560mm radiator with 8 fans on push/pull.
> 
> Able to overclock +120/1200.
> 
> Below my Port Royal score.
> 
> Temps never exceed 40°C


Something doesn't seem right about your score. For me, 3080 FTW3 Ultra on Hybrid cooler (2x Noctua A12x25 as intake) on 450watt BIOS, core voltage @ 100%, +120/+600 (might have been +550, don't remember) gave me my PB of 12840. Link: https://www.3dmark.com/pr/799367

I would try dropping your memory down; I think you're hitting memory error correction. There's no way you should be pulling lower scores at lower temperatures. I hit 44-45C during the run.


----------



## DaftConspiracy

acoustic said:


> Something doesn't seem right about your score. For me, 3080 FTW3 Ultra on Hybrid cooler (2x Noctua A12x25 as intake) on 450watt BIOS, core voltage @ 100%, +120/+600 (might have been +550, don't remember) gave me my PB of 12840. Link: https://www.3dmark.com/pr/799367
> 
> I would try dropping your memory down. I think you're hitting some memory correction. There's no way you should be pulling lower scores at lower temperatures. I hit 44-45c during the run.


I've noticed I can run Port Royal at a higher core clock without crashing than I can in Time Spy, but if I do, my score drops. I recommend using Time Spy to find your max "stable" OC, then running Port Royal. That said, I think he still scored slightly higher than me, and my card's got nothing left to give without voltage mods.


----------



## acoustic

DaftConspiracy said:


> I've noticed that I can run Port Royal at a higher core clock without crashing than I can in Time Spy, but if I do, my score drops. I recommend using Time Spy to find your max "stable" OC, then running Port Royal. That said, I think he still scored slightly higher than me, and my card's got nothing left to give without voltage mods.


Yeah, I can't run +120 in Time Spy or Fire Strike either. I can't run those clocks in games either; I run +45/+500 for 24/7 use, and some games like CP'77 had some weird issues until I went to +30/+500 due to temps. With the Hybrid cooler though, I got back to +45.


----------



## syncro2020

Here are my best scores with Asus Strix RTX 3080 OC with EK WaterBlock, i9-9900k Running at 5GHZ

Timespy:








I scored 18 392 in Time Spy (Intel Core i9-9900K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10)
www.3dmark.com





TimeSpy Extreme:








Result not found (www.3dmark.com)





PortRoyal:








I scored 12 802 in Port Royal (Intel Core i9-9900K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10)
www.3dmark.com





SuperPosition:










I'm still in the top 60-70 I think; those tests were done in December, before I sold my rig.


----------



## syncro2020

Here is my current MSI RTX 3080 Gaming X Trio on the stock air cooler, with the MSI Suprim BIOS flashed (MSI RTX 3080 VBIOS), max PL/voltage, +100 GPU clock / +900 memory.

Time Spy:








Result not found (www.3dmark.com)





Time Spy Extreme:








Result not found (www.3dmark.com)





Port Royal:








Result not found (www.3dmark.com)





SuperPosition:


----------



## BluePaint

Hmmm, in my case it doesn't seem to be worth getting a water block for benching, lol.
I am 56th on that list with 20,086 in TS. That's on air with like 10C ambient.
Getting a >5GHz Intel + 4400 RAM instead of the 5800X & 4066 RAM would probably give more GPU points in TS.


----------



## DaftConspiracy

Here's my current best Time Spy score for the GPU; of course Windows decided to run something during the CPU test, so the overall score took a hit. (Actually one of my worst CPU results.) http://www.3dmark.com/spy/17605300

And Port Royal








I scored 12 535 in Port Royal (AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10)
www.3dmark.com





You'll notice this run I was able to lock in 2160mhz but score actually went down








I scored 12 518 in Port Royal (AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10)
www.3dmark.com





I'll try some more now that I have the 1.1v bios


----------



## syncro2020

BluePaint said:


> Hmmm, in my case, for benching it doesn't seem to be worth getting a water block, lol.
> I am in that list on 56 with 20.086 TS. That's with air and like 10C ambient.
> Getting a > 5Ghz Intel + 4400 RAM instead of 5800X & 4066 RAM would probably give more GPU points in TS.


Wow, that's an impressive card you have, doing that much on AIR. Congrats on the lottery!


----------



## fraefm95

acoustic said:


> Something doesn't seem right about your score. For me, 3080 FTW3 Ultra on Hybrid cooler (2x Noctua A12x25 as intake) on 450watt BIOS, core voltage @ 100%, +120/+600 (might have been +550, don't remember) gave me my PB of 12840. Link: https://www.3dmark.com/pr/799367
> 
> I would try dropping your memory down. I think you're hitting some memory correction. There's no way you should be pulling lower scores at lower temperatures. I hit 44-45c during the run.


If I drop my memory to +1000 the score changes to 12564.

Perhaps it's down to the processor; I have a 9900K and you have a 10900K.


----------



## fraefm95

DaftConspiracy said:


> I've noticed that I can run Port Royal at a higher core clock without crashing than I can in Time Spy, but if I do, my score drops. I recommend using Time Spy to find your max "stable" OC, then running Port Royal. That said, I think he still scored slightly higher than me, and my card's got nothing left to give without voltage mods.


Why Time Spy? Extreme or normal?

Tried with higher overclock and got this:


----------



## syncro2020

> You'll notice this run I was able to lock in 2160mhz but score actually went down
> 
> http://www.3dmark.com/pr/799312



That's nice it can boost to 2160; my Asus was only able to go up to 2130, anything past that would crash even with a fully custom tuned curve.
The highest boost I've been able to get on the MSI Trio was 2070 I think, definitely the worse card of the two I've had.


----------



## Imprezzion

Typical.. the shop just e-mailed me. Not going to receive it; it was already gone by the time my payment went through....

I asked them if they expect any other 3080 model in the next 3 days, but I strongly doubt it..


----------



## fraefm95

I'll stop playing with clocks now.

But was able to achieve 12823 on PR: https://www.3dmark.com/3dm/57116076?

With +140/+1300 and max temps of 40°C


----------



## mouacyk

syncro2020 said:


> Here are my best scores with Asus Strix RTX 3080 OC with EK WaterBlock, i9-9900k Running at 5GHZ
> 
> Timespy:
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 18 392 in Time Spy
> 
> 
> Intel Core i9-9900K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10}
> 
> 
> 
> 
> www.3dmark.com


Based on your Time Spy score, the 3rd 8-pin on the 3080 alone is worth about 10% more performance (without a shunt mod, of course).


----------



## chiknnwatrmln

How hard are you guys pushing your cards?

I have an MSI Trio with the Suprim BIOS, set to 380w max TDP.

That's already a little uncomfortable for me so I don't want to go further. Sits around 2030mhz when gaming, under 70c.


----------



## acoustic

fraefm95 said:


> If i drop my memory to +1000 the score changes to 12564.
> 
> Perhaps its about the processor.
> 
> I have a 9900k and you a 10900k


I had a 9900K @ 5Ghz / 4.7Ghz cache, and pulled a 12826. I just recently grabbed the 10900K. For the 12840, I think my ambient temp was a tad lower and that allowed me one extra bin so I got a very small performance boost. Either that, or different driver.

12826 w/ 9900K proof: https://www.3dmark.com/pr/693004

Looking at your score link for the 12823, even though you're at +140, your peak and average clock speeds are lower than mine, even though we have identical cards on the same 450watt XOC BIOS. I'm not entirely sure why that is.

CPU doesn't matter very much in Port Royal. I see you hit 12823 though, now that's impressive! I honestly don't know why, but my card loves the higher voltages. If I try to undervolt like some are doing, I can't do anything; I can only do 1995Mhz @ 1.018v, and my memory is terrible, I can only get to +600 (when it's cold, usually +550) before error correction. When I crank the voltage slider up, my card goes nuts.

I would really like to get it under a full waterblock, especially to help my mem OC, but the Hybrid does a pretty good job it seems. I'm competing against a lot of you guys with full blocks and custom loops, and I'm not even running push-pull, just two Noctua A12x25s.

Either way, 12823 is a nice score! grats


----------



## Knoxx29

chiknnwatrmln said:


> That's already a little uncomfortable for me so I don't want to go further. Sits around 2030mhz when gaming, under 70c.


That's a nice Clock speed, enjoy it.
I really love to overclock my cards, but this time I'll pass; I'm not going to mess with an €800 card.


----------



## mouacyk

Knoxx29 said:


> That's a nice Clock speed, enjoy it.
> I really love to overclock my cards but this time i pass, not going to mess with a 800€ card.


Especially one that's in short supply, right?


----------



## Knoxx29

mouacyk said:


> Especially one that's in short supply, right?


Totally agree.
I enjoy it as-is, underclocked to 1800mhz at 0.812v.


----------



## chiknnwatrmln

Knoxx29 said:


> That's a nice Clock speed, enjoy it.
> I really love to overclock my cards but this time i pass, not going to mess with a 800€ card.


For sure, stock for stock I saw a solid 30% gain from a 3070FE to this card.

Dealing with a 400w space heater in my case is another issue though


----------



## syncro2020

mouacyk said:


> Based on your timespy score, the 3rd pin on 3080 alone is worth about 10% more performance (of course without shunt mod).


No shunt mods were done on that one


----------



## syncro2020

chiknnwatrmln said:


> How hard are you guys pushing your cards?
> 
> I have an MSI Trio with the Suprim BIOS, set to 380w max TDP.
> 
> That's already a little uncomfortable for me so I don't want to go further. Sits around 2030mhz when gaming, under 70c.


I max out the PL on my Trio with the Suprim gaming BIOS (430w limit), no issue, and why would there be any? The VRM design is not the best, but it can handle it, kinda lol


----------



## chiknnwatrmln

syncro2020 said:


> I max out the PL on my Trio with the Suprim gaming BIOS (430w limit), no issue, and why would there be any? The VRM design is not the best, but it can handle it, kinda lol


Not sure, I've never had a video card pull that kind of juice, so it just makes me iffy; I've never seen more than 300w for one card.

Then again, I guess if temps are in check I'm good to go.


----------



## blackzaru

Spent the afternoon "feeling out" my 3080 with incremental offsets to get an idea of its performance before I do a manual curve. Here are the results!

For the obvious questions to pop up:

Why the comparison to +150 core +600 mem? It's what I was running at in most games, as my 24/7 settings, until I take the time to manually set a 24/7 curve.

Specs and settings? Watercooled Asus TUF 3080 and 5950X (dual 360mm EKWB XE radiators), on an Asus X570 Dark Hero, with memory running at 3733MHz CL16. I have yet to finalize any CPU or memory OC, but for now I have PBO activated and DOS (Dynamic OC Switcher) set at 4.5GHz 1.25V.

You can ask any other questions if you want. I hope I'll have the time to do proper manual curve testing for the GPU, and to pin down my CPU and RAM OC next week or the week after.


----------



## DaftConspiracy

fraefm95 said:


> Why Time Spy? Extreme or normal?


Normal



syncro2020 said:


> http://www.3dmark.com/pr/799312
> 
> That's nice it can boost to 2160; my Asus was only able to go up to 2130, anything past that would crash even with a fully custom tuned curve.
> The highest boost I've been able to get on the MSI Trio was 2070 I think, definitely the worse card of the two I've had.


Only got it to do that in port royal. Max stable clock I could get was actually 2145mhz in games and other benchmarks


----------



## fraefm95

acoustic said:


> I had a 9900K @ 5Ghz / 4.7Ghz cache, and pulled a 12826. I just recently grabbed the 10900K. For the 12840, I think my ambient temp was a tad lower and that allowed me one extra bin so I got a very small performance boost. Either that, or different driver.
> 
> 12826 w/ 9900K proof: https://www.3dmark.com/pr/693004
> 
> Looking at your score link for the 12823, even though you're at +140, your peak and average clock speeds are lower than mine, even though we have identical cards on the same 450watt XOC bios. I'm not entirely sure why that is.
> 
> CPU doesn't matter very much with Port Royal. I see you hit 12823 though, now that's impressive! I honestly don't know but my card loves the higher voltages. If I try to undervolt like some are doing, I can't do anything. I can only do 1995Mhz @ 1.018v, and my memory is terrible, I can only get to +600 (when it's cold, usually +550) before error correction. When you crank the voltage slider up, my card goes nuts. I would really like to get it under a full waterblock, especially to help my mem OC, but the Hybrid does a pretty good job it seems. I'm competing against a lot of you guys with full blocks with custom loops. I'm not even running push-pull, just two Noctua A12x25s.
> 
> Either way, 12823 is a nice score! grats


Your temps are very nice for a hybrid cooler; I wouldn't bother putting it on a custom loop.

My FTW3, when on the air cooler, was at 66-70°C and could do +60/+1000.

Now with the waterblock, I play CP2077, Watch Dogs Legion and others at a constant 2100Mhz core with 10600Mhz memory (+100/+1100).

Any more than that and it crashes.

I've never undervolted because temps are fine (<41°C).

Do you think it's worth upgrading to the 10900k?


----------



## fraefm95

DaftConspiracy said:


> Normal


TimeSpy Score: https://www.3dmark.com/3dm/57116888

CPU Score is lower and GPU Score is higher.


----------



## DaftConspiracy

fraefm95 said:


> TimeSpy Score: https://www.3dmark.com/3dm/57116888
> 
> CPU Score is lower and GPU Score is higher.


Interesting how you were able to get a better score than me in Port Royal but a worse GPU score in Time Spy. Are you hitting the power limit? I noticed Time Spy consumed a good bit more power for me.

Sent from my IN2025 using Tapatalk


----------



## mouacyk

Has any analysis been done comparing the 3x8-pin and 2x8-pin PCBs, to see whether the additional VRMs are warranted for the additional ~40% power? For example, from the Aorus Master to the Aorus Extreme. What's the likelihood that the 2x8-pin version will hold up under the same power through shunting, with durability similar to the 3x8-pin?


----------



## DaftConspiracy

mouacyk said:


> Is there any analysis done between 3pin and 2pin pcbs to see if additional VRMs are warranted for the additional ~40% power? For example from the Aorus Master to Aorus Extreme. What's the likelihood that the 2pin version will hold up the same power through shunting with similar durability to the 3pin?


Pretty high. I'm pulling nearly 450w through my 3080 TUF.

Sent from my IN2025 using Tapatalk


----------



## mouacyk

DaftConspiracy said:


> Pretty high. I'm pulling nearly 450w through my 3080 TUF.
> 
> Sent from my IN2025 using Tapatalk


From PCB shots, I see that the Strix has 2 more GPU phases? Is that just extra padding and justification to charge more for the extra power pin then?

Please share if anyone comes across a PCB shot for the Aorus 3080 Extreme. Thanks.
Here is the Aorus Master (funny, it has all 6 POSCAPs, whereas the cheaper Eagle OC has 1 MLCC group):


----------



## acoustic

fraefm95 said:


> Your temps are very nice for a hybrid cooler, i wouldnt put it on custom loop.
> 
> My ftw3 when on air cooler, was at 66/70°C and i could do +60/+1000.
> 
> Now with waterblock, I play CP2077, Watch Dogs Legion and others at constant 2100Mhz core with 10600Mhz memory. (+100/+1100)
> 
> More than that and crashes.
> 
> Never undervolted because temps are fine (<41°C)
> 
> You think its worth upgrading to 10900k?


I definitely would NOT upgrade your 9900K to the 10900K. I only did it because I sold my 9900K+motherboard to a buddy so he could upgrade his 6700K, and sold some other parts. I ended up pocketing $200 after buying the 10900K and the motherboard.

It's been fun to play with some new hardware, but I haven't noticed any improvements in games, especially since I game at 3840x1600. I did some AC:Valhalla benches and at 5.1Ghz all-core on the 10900K, I'm still getting the exact same average FPS as my 9900K @ 5Ghz gave me. The 0.1% lows, 1% lows, and minimums did go up, but nothing major.


----------



## zlatanselvic

Will the XOC FTW3 450W BIOS work on the SuprimX?


----------



## DaftConspiracy

zlatanselvic said:


> Will the XOC FTW3 450W BIOS work on the SuprimX?


As long as it's a 3x8-pin card, it should. If you don't have a BIOS switch, I would make sure first though. With the switch it's pretty much impossible to brick the card: even if you flash a bad BIOS, you can just boot on the other one, flip back to the bad one (with Windows running) and reflash over it.

Sent from my IN2025 using Tapatalk


----------



## edhutner

The Suprim X should be capable of 430W; I don't think the extra 20 watts would make a difference.


----------



## DaftConspiracy

edhutner said:


> The Suprim X should be capable of 430W; I don't think the extra 20 watts would make a difference.


You'd be amazed

Sent from my IN2025 using Tapatalk


----------



## Hirtle

Managed to get 13k in Port Royal. I thought I'd be able to get just a little more out of it but it's hitting power limit. Haven't decided if I want to shunt mod it yet.

https://www.3dmark.com/pr/790788


----------



## DaftConspiracy

Hirtle said:


> Managed to get 13k in Port Royal. I thought I'd be able to get just a little more out of it but it's hitting power limit. Haven't decided if I want to shunt mod it yet.
> 
> https://www.3dmark.com/pr/790788


Wow, what waterblock are you using?  My EK block will only go down to 44c, clearly these chips like cold.

Sent from my IN2025 using Tapatalk


----------



## Hirtle

DaftConspiracy said:


> Wow, what waterblock are you using?  My EK block will only go down to 44c, clearly these chips like cold.
> 
> Sent from my IN2025 using Tapatalk


It's an EK block. After having issues with their backplate, I double checked the water block for good contact. I made sure all the thermal pads made good contact as well. I saw your score averaged 39 C, is that with the EK block?


----------



## ssgwright

Colonel_Klinck said:


> Yeah I have R008 stacked on the stock. Tempted to try the R005 but don't want to overload the PCI/24pin.


I'm using R005 and it works fine, although my PR score isn't much higher than yours: 12,980.


----------



## SPL Tech

Hirtle said:


> Haven't decided if I want to shunt mod it yet.
> 
> https://www.3dmark.com/pr/790788


You want to shunt mod it. Do it. Your monitor will thank you.


----------



## DaftConspiracy

Hirtle said:


> It's an EK block. After having issues with their backplate, I double checked the water block for good contact. I made sure all the thermal pads made good contact as well. I saw your score averaged 39 C, is that with the EK block?


Yeah, that was with fans at full speed. I'm using the EK TUF block with liquid metal. I've taken it apart a few times now, and the contact couldn't be better if I wanted it to be. I know my loop can handle much more heat too; the water temp doesn't get nearly as warm with this GPU as it did with my 2080 Super, which ran at 38c with the fans practically idling. I think these jet plate designs are no good for setups with strong pumps; my last block was a flow-through.

Sent from my IN2025 using Tapatalk


----------



## DaftConspiracy

SPL Tech said:


> You want to shunt mod it. Do it. Your monitor will thank you.


Yeah that's a golden chip right there, with a shunt mod you'll be able to hold 2170ish all the time. What does your voltage curve look like? 

Sent from my IN2025 using Tapatalk


----------



## Hirtle

SPL Tech said:


> You want to shunt mod it. Do it. Your monitor will thank you.


I really do. I just wish I could find a BIOS with a higher power limit instead. That way I could switch between that and stock and not worry about crazy power consumption if I'm just gaming.


----------



## blackzaru

DaftConspiracy said:


> Yeah that was with fans at full speed. I'm using the ek tuf block with liquid metal. I've taken it apart a few times now and I've seen the contact couldn't be better if I wanted it to be. I know my loop can handle much more heat too, the water temp doesn't get nearly as warm with this GPU as it did with my 2080s, which ran at 38c with the fans practically idling. Think these jet plate designs are no good for setups with strong pumps, my last block was a flow through.
> 
> Sent from my IN2025 using Tapatalk


What's your ambient temperature and cooling setup? I'm running a shunt-modded TUF with the EK block and liquid metal, and even a full synthetic load on both the GPU and CPU (5950X) at the same time has difficulty getting the GPU into the 40s, despite dumping around 750-800W of heat into the loop between the GPU and CPU. (Measured wattage at the wall is around 850-875W.)
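As a rough sanity check on heat loads like that, the per-pass coolant temperature rise follows from water's heat capacity. A minimal sketch, assuming a typical ~4 L/min loop flow rate (`coolant_delta_t` is a hypothetical helper, not anyone's actual tooling):

```python
def coolant_delta_t(heat_w, flow_l_per_min):
    """Per-pass coolant temperature rise for a given heat load.

    Assumes water: specific heat ~4186 J/(kg*K), density ~1 kg/L.
    """
    mass_flow_kg_s = flow_l_per_min / 60.0  # 1 L/min of water is ~1/60 kg/s
    return heat_w / (mass_flow_kg_s * 4186.0)

# ~800 W dumped into the loop at an assumed 4 L/min:
print(f"{coolant_delta_t(800, 4.0):.1f} C per pass")
```

At sensible flow rates the water only warms by roughly 3C per pass even under an 800W load, so the steady-state water temperature is set by the radiators and fans, not by the heat load alone.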


----------



## DaftConspiracy

blackzaru said:


> What's your ambient temperature and cooling setup? I'm running a shunt-modded TUF with the EK block and liquid metal, and even a full synthetic load on both the GPU and CPU (5950X) at the same time has difficulty getting the GPU into the 40s, despite dumping around 750-800W of heat into the loop between the GPU and CPU. (Measured wattage at the wall is around 850-875W.)


Ambient temp is 68-70f. I'm running a 360x40mm rad and a 360x45mm rad with essentially a D5 pump but with much better head pressure and slightly worse flow (better performance overall), in an O11D with a custom open front panel. I am running water into what's technically the "out" port of the block, but EK claims it only makes a 1-2c difference. Fans are NF-A12x25 on one rad and Silent Wings 3 on the other. The CPU is a 5800X, which isn't adding much heat to the loop.

Sent from my IN2025 using Tapatalk


----------



## TK421

Do we have any info on the Vulcan/iGame models? Exclusive to the Asian and Australian markets, I think?


----------



## blackzaru

DaftConspiracy said:


> Ambient temp is 68-70f, I'm running a 360x40mm rad and a 360x45mm rad with essentially a D5 pump but with much better head pressure and slightly worse flow (way better performance in all) in an o11d with a custom front open front panel. I am running water into technically the "out" port of the block, but ek claims it only makes 1-2c difference. Fans are nf-a12x25 on one rad and silent wings 3 on the other. CPU is a 5800x which isn't adding much heat to the loop.
> 
> Sent from my IN2025 using Tapatalk


Your room temperature is similar to mine. The main difference is that I'm running two 360x60mm radiators, so about 41% more metal to exchange heat (120mm (60+60) total thickness vs your 85mm (40+45)), with Arctic P12 fans (I had Noctua industrial fans, but they were too noisy when ramping up aggressively; the problem with having 3000rpm fans...), in an O11 Dynamic as well. But even then, you've got enough cooling power to keep that temp under control. 44 degrees is not a "problem", but it's a bit weird. What did you shunt with, 5 or 8mOhm? (I'm on 8, so that might be the difference with you.) Otherwise, are you running your pump at very low rpm?


----------



## MrKenzie

TK421 said:


> do we have any info on vulcan/igame models? exclusive to asian and australian markets I think?


I am in Australia and have an iGame 3080 Advanced OC. It performed well at stock, but just bounced off the power limiter like most 3080s. I've now loaded the 450W Strix BIOS and have it running in my chilled water loop (aquarium chiller). It will pass benchmarks at around 2250MHz, and will game at between 2150-2220 depending on the game (higher crashes).

Using the 450W BIOS gained an average of about 80-100MHz, and chilled water gained a further 100MHz. Overall I'm happy with it.


----------



## DaftConspiracy

blackzaru said:


> Your room temperature is similar to mine. The main difference is that I'm running two 360x60mm radiators, so about 41% more metal to exchange heat (120mm (60+60) total thickness vs your 85mm (40+45)), with Arctic P12 fans (I had Noctua industrial fans, but they were too noisy when ramping up aggressively; the problem with having 3000rpm fans...), in an O11 Dynamic as well. But even then, you've got enough cooling power to keep that temp under control. 44 degrees is not a "problem", but it's a bit weird. What did you shunt with, 5 or 8mOhm? (I'm on 8, so that might be the difference with you.) Otherwise, are you running your pump at very low rpm?


I stand corrected, I have a 45mm and a 55mm rad. The thing that bothers me is that in games I'll see up to 47c while water temp never goes above 36c, whereas with my 2080 Super I was seeing water temp within 6c of the GPU. Pump speed stays at 100% at all times. I know my 2080 Super only consumed 280w average, but the temp difference is insane. I'm really tempted to remove my jet plate and see if it helps.

Sent from my IN2025 using Tapatalk


----------



## blackzaru

DaftConspiracy said:


> I stand corrected, I have a 45mm and a 55mm rad. The thing that bothers me is that in games I'll see up to 47c while water temp never goes above 36c, whereas with my 2080 Super I was seeing water temp within 6c of the GPU. Pump speed stays at 100% at all times. I know my 2080 Super only consumed 280w average, but the temp difference is insane. I'm really tempted to remove my jet plate and see if it helps.
> 
> Sent from my IN2025 using Tapatalk


I see what it is: your water temp gets way higher than mine. Mine never breaches the 30C mark. I might be running a slightly more aggressive fan curve than you, and the additional cooling capacity of my setup might also help. That 6 or more degrees of difference in water temps might actually be what would push your GPU's temp down to my level, or near it. I can tell you that I do still run my jet plate, though the fact that you have the flow reversed might interact badly with it.


----------



## SPL Tech

DaftConspiracy said:


> Yeah that's a golden chip right there, with a shunt mod you'll be able to hold 2170ish all the time. What does your voltage curve look like?
> 
> Sent from my IN2025 using Tapatalk


haha, I don't know about that. I am on water and I max out at 2085 with occasional bumps to 2100. Any higher and Cyberpunk 2077 crashes. I can get away with around 2140 for a single run in Superposition, but it's not stable in heavy gaming. In general, the 3080 is not likely to get over 2100 rock-solid 24/7 stable. Anything above that and you're in an unstable configuration.


----------



## SPL Tech

Hirtle said:


> I really do. I just wish I could find a BIOS with a higher power limit instead. That way I could switch between that and stock and not worry about crazy power consumption if I'm just gaming.


It doesn't exist. Every BIOS out there has been tested; they are more or less all the same.


----------



## SPL Tech

blackzaru said:


> time, has difficulty getting the gpu at in the 40 degrees


You should know that the GPU temp sensor is extremely inaccurate at low temps. It is designed to be most accurate at around 90C and the further you get from that the less accurate it is. In general, below 50C it's off by quite a bit. For example, at idle my water temp is about 35C but the GPU reads 31C. Obviously that's physically impossible.


----------



## DaftConspiracy

SPL Tech said:


> haha, I don't know about that. I am on water and I max out at 2085 with occasional bumps to 2100. Any higher and Cyberpunk 2077 crashes. I can get away with around 2140 for a single run in Superposition, but it's not stable in heavy gaming. In general, the 3080 is not likely to get over 2100 rock-solid 24/7 stable. Anything above that and you're in an unstable configuration.


Meant to direct that at Hirtle, my bad. Take a look at the frequency he's running in Port Royal. My card's about the same silicon quality as yours though; if I up the voltage to 1.1v I can run Cyberpunk at 2130mhz, and at 1.075v I run 2100mhz (I think at 1.081v it jumps to 2115mhz). I'll post my voltage curve tomorrow if I remember.

Strangely, this card seems to scale better with voltage the higher the voltage gets. I actually had to make the curve steeper than stock above 1.075v.

Sent from my IN2025 using Tapatalk


----------



## Mr Ripper

Has anyone got a Gigabyte Master rev 2.0 (*3x8pin* version)?








GIGABYTE AORUS RTX 3080 MASTER Rev. 2.0 Specs (NVIDIA GA102, 1845 MHz, 8704 Cores, 272 TMUs, 96 ROPs, 10240 MB GDDR6X, 1188 MHz, 320 bit)
www.techpowerup.com





Looks like they carried over the Xtreme's 3x8-pin design to save on production costs. I see no BIOS for it yet, and was wondering whether it could run at the same 450w as the Xtreme with the right BIOS.


----------



## FedeX299I57640X

I need to order a GPU and I've seen a Zotac Trinity 3080 and a 3080 AMP Holo. Does anyone have either of these GPUs?
I have to make a choice!


----------



## Hirtle

SPL Tech said:


> It doesnt exist. Every BIOS that exists has been tested. They are more or less all the same.


Right, the Strix is already 450W, which seems to be the highest you can get on a 3080. I know there's a 1000W BIOS out there for the 3090; with that one being the "flagship", I guess there won't be a BIOS like that for the 3080.


----------



## Colonel_Klinck

ssgwright said:


> I'm using R005, works fine.. although my PR score isn't much higher than yours : 12,980



Ah ok, I'll replace the 8 with 5 then. I have plenty of radiator surface area to lose the heat.


----------



## scanz

Mr Ripper said:


> Has anyone got a Gigabyte Master rev 2.0 (*3x8pin *version)?
> 
> 
> 
> 
> 
> 
> 
> 
> GIGABYTE AORUS RTX 3080 MASTER Rev. 2.0 Specs (NVIDIA GA102, 1845 MHz, 8704 Cores, 272 TMUs, 96 ROPs, 10240 MB GDDR6X, 1188 MHz, 320 bit)
> www.techpowerup.com
> 
> 
> 
> 
> 
> Looks like they consolidated the Xtreme 3 pin design to save on production costs. I see no bios yet and was wondering whether it could run at the same 450w power as the Xtreme with the right bios.


So sad about this. It makes me hate my current Master, because I feel it has so much headroom for tweaking (temps never go higher than 60c no matter what I throw at it) but is severely hampered by the power limit. I don't understand why they didn't provide 3x8-pin to begin with.

I'd be interested to know if anyone here has been able to push any extra performance out of the current Master, and if so, what they did. I've given mine slightly higher core and memory clocks, but there's only so much I feel I can do while power limited.


----------



## Nizzen

scanz said:


> So sad about this, makes me hate my current Master because I feel it has so much headroom for tweaking (temps never go higher than 60c no matter what i throw at it), but is severely hampered by being power limited. Don't understand why they didn't provide a 3x8pin to begin with.
> 
> I'd be interested to know if anyone here has been able to push out any extra performance out of the current Master and if so what they did. I've given mine slightly higher core and memory clock, but only so much I feel i can do due to being power limited.


Shunt mod it....


----------



## TK421

MrKenzie said:


> I am in Australia and have an iGame 3080 Advanced OC. It performed well at stock, but just bounced off the power limiter like most 3080s. I've now loaded the 450W Strix BIOS and have it running in my chilled water loop (aquarium chiller). It will pass benchmarks at around 2250MHz, and will game at between 2150-2220 depending on the game (higher crashes).
> 
> Using the 450W BIOS gained an average of about 80-100MHz, and chilled water gained a further 100MHz. Overall I'm happy with it.


Is the Strix BIOS fine to use on a 2x8-pin card?

Do you see any erroneous power consumption readings with the Strix BIOS?


----------



## DaftConspiracy

Colonel_Klinck said:


> Ah ok, I'll replace the 8 with 5 then. I have plenty of radiator surface area to lose the heat.


Make sure you shunt all 5 resistors near the power connectors; I didn't do that and I'm hitting the power limit on the core now. I also had to stack a 0.04 ohm resistor on my PCIe shunt because the slot only wanted to draw 67w.

Sent from my IN2025 using Tapatalk


----------



## zlatanselvic

Some updates:

I've built a few different rigs for either myself or customers in the last 2 months. Here are some real-world comparisons vs. what I'm seeing all the YouTubers post. Kind of a fun little research project.

The reference-PCB cards all max out on air 500-1000 points below the top-tier cards in Time Spy graphics score, regardless of BIOS. Shunting opens more doors, and water stabilizes the achievable clocks. We shunted a buddy's 3090 TUF and saw some good results. Your boost bins become more consistent, as you can reach clocks about 15mhz higher for a given drop in temp. IIRC it's 15mhz per 10c, I know someone in this thread knows.
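That rule of thumb can be written down as a quick estimator. Both constants below (15MHz per bin, one bin per 10C) are the thread's recollection rather than confirmed NVIDIA figures, and `boost_gain_mhz` is a hypothetical helper:

```python
def boost_gain_mhz(temp_drop_c, mhz_per_bin=15, c_per_bin=10):
    """Estimate GPU Boost clock gained from running the core cooler.

    GPU Boost moves in fixed-size bins, so only whole bins count.
    """
    bins = int(temp_drop_c // c_per_bin)
    return bins * mhz_per_bin

# Going from ~70 C on air to ~40 C under water:
print(boost_gain_mhz(70 - 40))  # -> 45
```

Under these assumptions a 30C temperature drop buys about three bins, which matches the 80-100MHz people report only once a higher power limit is stacked on top of the cooling.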


The top tier cards really benefit from watts, I was able to pick up another boost bin from the MSI Suprim X to the 450W XOC bios. This was all thanks to the 20 more watts.











All in all, the performance gains from hardcore overclocking this generation are minimal. There's not much meat on the bone compared to before. For most users, a higher-wattage BIOS and an aggressive fan curve will get you close to what you can get from the card. Anything beyond that requires a lot of effort and heat for minimal results.


That said, the fun never stops!


----------



## Colonel_Klinck

DaftConspiracy said:


> Make sure you shunt all 5 resistors near the power connectors, I didn't do that and I'm hitting power limit on the core now. Also had to add a 0.04 ohm resistor to my pcie because it only wanted to draw 67w.
> 
> Sent from my IN2025 using Tapatalk



I have R008 on all 6 shunts at present. The plan is to switch them all to R005


----------



## BMauri92

Guys, I just watercooled my 3080 Trio. Is there any possibility of swapping the BIOS for a higher-TDP one? Currently I'm sitting at 340W and 24/30C on temps lol


----------



## DaftConspiracy

Colonel_Klinck said:


> I have R008 on all 6 shunts at present. The plan is to switch them all to R005


If anything, put a higher-resistance shunt on the PCIe slot. Once you put 5mOhm shunts on the others, the PCIe will draw up to 150W with a 5mOhm shunt or 121W with an 8mOhm shunt. I would stick with 30mOhm or 40mOhm for the PCIe; that'll put it at 88W or 84W respectively. You'll still be able to draw 450-500W total with one of those.
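Those figures fall out of simple parallel-resistor math. A minimal sketch, under these assumptions (not confirmed specs): the stock PCIe shunt is 5 mOhm, the BIOS-reported PCIe limit is 75 W, and the mod stacks the new shunt on top of the stock one (in parallel):

```python
# Sketch of the stacked-shunt math. Assumptions (not confirmed specs):
# stock PCIe shunt is 5 mOhm, BIOS-reported PCIe limit is 75 W, and
# the new shunt is soldered on top of the stock one (parallel).

def parallel(r1_mohm, r2_mohm):
    """Effective resistance of two shunts in parallel, in mOhm."""
    return r1_mohm * r2_mohm / (r1_mohm + r2_mohm)

def real_limit(reported_limit_w, r_stock_mohm, r_stacked_mohm):
    """Real power a rail can draw before the under-reading limit trips.

    The controller reads the voltage drop across R_eff but assumes
    R_stock, so it under-reads by R_eff / R_stock; the real limit
    scales by the inverse ratio.
    """
    r_eff = parallel(r_stock_mohm, r_stacked_mohm)
    return reported_limit_w * r_stock_mohm / r_eff

for stacked in (5, 8, 30, 40):
    print(f"{stacked} mOhm stacked -> {real_limit(75, 5, stacked):.1f} W")
```

With those assumptions this reproduces the quoted figures within rounding: 150 W, ~122 W, ~88 W, and ~84 W for 5, 8, 30, and 40 mOhm stacks.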

Sent from my IN2025 using Tapatalk


----------



## DaftConspiracy

BMauri92 said:


> Guys I just watercooler my 3080 trio, there is possibility on swap the BIOS to any higher TDP, current I'm sitting on 340w and 24/30 on temp lol


Flash a strix bios or the 450w ftw3 bios on it

Sent from my IN2025 using Tapatalk


----------



## Ramshot

Vapochilled said:


> After yes on the flash procedure... Screen went black.... Nothing else
> 
> I did a reboot...nothing.. but I could see the LED reads.. so windows is working
> 
> I had in my mind how to flash back.. I recorded all the steps thru keyboard hahaha in case the screen went black... So I flashed back eagle and gaming oc bios.. they both work..aorus didn't...
> 
> Someone else want to try?


I did it. You can flash the Aorus Extreme BIOS. Just make sure you plug in to the DP port furthest from the PCIe pin side; mine went black as well on the middle DP.
It's messed up though. It shows lower GPU temps than it should, and MSI Afterburner says you're hitting around 440W. I set it to 122% power limit as well, and it netted me no extra performance compared to my undervolt: 18,456 GPU score in Time Spy with the undervolt, which was slightly better than the Aorus BIOS.


----------



## Ramshot

Vapochilled said:


> Conclusion: I have a gigabyte eagle oc, it works with gaming oc bios, but not with that aorus bios. ((
> 
> Anyone else wants to give it a try?


It does work. Just don't use the middle DisplayPort.


----------



## BMauri92

DaftConspiracy said:


> Flash a strix bios or the 450w ftw3 bios on it
> 
> Sent from my IN2025 using Tapatalk


any idea how to do it lol? first time flashing a GPU


----------



## SPL Tech

DaftConspiracy said:


> Meant to direct that to Hirtle, my bad. Take a look at the frequency he's running in Port royal. My cards about the same silicon quality as yours though, if I up voltage to 1.1v I can run cyberpunk at 2130mhz. At 1.075v I run at 2100mhz. (Think at 1.081v it jumps to 2115mhz). I'll post my voltage curve tomorrow if I remember.


How do you get to 1.1v? I thought these cards were hard capped at 1.075v, with the card rarely going over 1.06v. GPU-Z is telling me my perfcap is VOp at 1.07v and it won't boost higher than that.


----------



## DaftConspiracy

SPL Tech said:


> How do you get to 1.1v? I thought these cards are hard capped at 1.075 with the card rarely going over 1.06? GPUZ is telling me I am VOp at 1.07 and it wont boost higher than that.


You need a really high power limit and you need to max the voltage slider in Afterburner/X1/whatever you use. I couldn't get it to go to 1.1v until I flashed a rev 1.0 TUF OC BIOS on mine; with the updated TUF BIOS it'll only go to 1.075v. I assume it's something Nvidia did to the boost algorithm to prevent the crashes most cards were having. Odds are a rev 1.0 BIOS of most cards will let you load that high as long as power permits it.
You also have to make sure to set up the voltage/frequency curve so that at 1.1v it goes up 15MHz from wherever it is at 1.091v. The boost algorithm will only run at the lowest voltage set for a 15MHz "bin."

Sent from my IN2025 using Tapatalk


----------



## DaftConspiracy

Contrary to what most people think, the voltage slider isn't actually placebo. It won't offset the voltage or alter the voltage/frequency curve, but it will adjust the voltage limit. Say you set the card to hit 2.1GHz at 1.075v and 2.13GHz at 1.1v. By default it will only go up to 2.1GHz at 1.075v, but if you set the voltage slider to 100% it'll go up to 2.13GHz at 1.1v. Keep in mind that if you tell it to hit 2.13GHz at 1.081v and maintain that frequency through 1.1v, it will only ever set the voltage to 1.081v, since that's the minimum you told it could run that frequency.
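The bin-selection behavior described here can be modeled in a few lines. The curve points below are made-up illustrative values, not anyone's actual curve:

```python
# Toy model of the boost behavior described above: for a target
# frequency, the card settles on the LOWEST curve voltage that is
# mapped to that frequency. Curve points are illustrative only.

def voltage_for(curve, target_mhz):
    """Return the lowest voltage whose curve point reaches target_mhz."""
    candidates = [v for v, f in curve if f >= target_mhz]
    return min(candidates) if candidates else None

# (voltage, MHz) pairs, like Afterburner's curve editor
curve = [(1.075, 2100), (1.081, 2130), (1.100, 2130)]

print(voltage_for(curve, 2130))  # settles at 1.081 V, never 1.1 V
print(voltage_for(curve, 2100))  # 1.075 V
```

This is why the curve has to step up 15 MHz at 1.1 V relative to 1.091 V if you actually want the card to request 1.1 V.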

Sent from my IN2025 using Tapatalk


----------



## SPL Tech

DaftConspiracy said:


> You need a really high power limit and you need to max the voltage slider in afterburner/X1/whatever you use. I couldnt get it to go to 1.1v until I flashed a rev 1.0 tuf OC bios on mine. With the updated tuf bios it'll only go to 1.075v. I assume it's something Nvidia did the boost algorithm to prevent the crashes most cards were having. Odds are a rev 1.0 bios of most cards will let you load that high as long as power permits it.
> You also have to make sure to setup the voltage/fr curve so at 1.1v it goes up 15mhz from wherever it is at 1.091v. The boost algorithm with only run at the lowest voltage set for a 15mhz "bin."
> 
> Sent from my IN2025 using Tapatalk


I have an EVGA 3080 XC3 and the voltage slider is disabled in Afterburner. What does that mean? I shunt modded my card so the power limit is not an issue.


----------



## DaftConspiracy

SPL Tech said:


> I have an Evga 3080 XC3 and the voltage slider is disabled in Afterburner. What does that mean? I shunt moded my card so power limit is not an issue.


That's just the default setting. You need the latest beta version of Afterburner, and you have to enable voltage control in settings.

Sent from my IN2025 using Tapatalk


----------



## noxyd

Hey folks, 

3080 FTW3 with XOC bios and EK block here.
I have an SFF PC, so my loop is way less powerful than some others here (makes me wonder if anything's wrong, actually!).

19,827 pts in Time Spy 








I scored 16 556 in Time Spy -- Intel Core i7-9700K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10 (www.3dmark.com)





voltage +100%
core +80 (won't go above)
memory +1200

average frequency is 2055MHz, so quite a bit lower than others'; I wonder if it's because of my higher temperatures.
However, the score seems decent.
I ran the test using my regular fan curve (so very quiet).

Have you guys tried OC Scanner? It seems to work well until a certain point, where it goes all over the place, hitting the power limit at low voltage.


----------



## BMauri92

Guys I was able to flash my trio to the evga bios and this was my result 

https://www.3dmark.com/3dm/57308772


----------



## BluePaint

@BMauri92 
Good frequencies and temps. With an OC'd Intel CPU and 4000+ RAM you'd have 20,000 GPU pts in TS.


----------



## Felgor

I got the Zotac 3080 AMP Holo 340W BIOS loaded on my Galax 3080 SG without issue, and it tops out at 345W now, which has netted me a few clock bins. However, the TDP slider in Afterburner 4.6.3 beta makes no difference from 100 to 110%; max TDP is 101.5% at either setting.

Perfcap is Power.

Does this mean my only option is a shunt mod, or should I try another BIOS? I only need 1x DP output.

Edit: The PCIe 12v rail was sagging to 12.5 or 12.6v, and the sag coincided with every power limit event. I have just tested with a Zalman 850W on the PCIe connectors; the result is more TDP headroom before power limiting, and a curve-locked clock of 2040MHz was stable, holding 2040 for most of the Heaven run (occasionally 2025). Back on just the Corsair PSU, clocks bounce around 1965 to 2010.


----------



## MrKenzie

TK421 said:


> using strix is fine for 2x8 card?
> 
> do you see any erroneous power consumption readings with the strix bios?


It is a 3x 8-pin card so no issues at all.
Unfortunately the memory seems bad on my card; anything over +550 is an instant driver crash, even with 5C water temps cooling it. I see a lot of people running +1100-1200.


----------



## TK421

MrKenzie said:


> It is a 3x 8-pin card so no issues at all.
> Unfortunately the memory seems bad on my card, anything over +550 is instant driver crash, even with 5c water temp cooling them. I see a lot of people running +1100-1200.


sorry to hear about the memory

I still haven't done any tests with my Strix BIOS, but it seems fine if I just set it at +500 or +600

core is 2085/2100MHz, still on air



really curious to see what the PCB of the Vulcan looks like though


----------



## Colonel_Klinck

DaftConspiracy said:


> If anything put a higher resistance shunt on the pcie, once you put 5mohm shunts on the others it'll let the pcie draw up to 150w with a 5mohm shunt or 121w with an 8mohm shunt. I would stick with 30mohm or 40mohm for the pcie. Thatll put it at 84w or 88w respectively. You'll still be able to draw 450-500w total with one of those.
> 
> Sent from my IN2025 using Tapatalk


I was never getting more than 56W reported through the PCIe slot before modding it.

I thought you had to use the same shunts on all of them or it would mess up the current balancing on the card?


----------



## DaftConspiracy

Colonel_Klinck said:


> I was never getting more than 56w reported through the PCI before modding it.
> 
> I thought you had to use the same shunts on all or it would mess up the current balancing on the card?


It won't mess up current balancing; each of those resistors measures power for a separate component, and if any of the 6 hits its limit the whole card stops drawing additional power. The card will always draw 21.4% of its power from the PCIe slot and the rest from the 8-pins (if it's a 2x8-pin card), regardless of what shunts are used. Different shunts only alter the reported power (effectively altering the limit for the component they report on).
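Taking that fixed 21.4% PCIe share at face value (the ratio and the even split across 8-pins are this post's claims, not measured values), the per-rail draw works out like this:

```python
# Rough per-rail split implied above. Assumptions: a fixed 21.4%
# PCIe-slot share and the remainder split evenly across the 8-pins.

def rail_split(total_w, n_8pin=2):
    """Return (pcie_slot_w, per_8pin_w) for a given total board power."""
    pcie = 0.214 * total_w
    per_8pin = (total_w - pcie) / n_8pin
    return pcie, per_8pin

pcie, per_pin = rail_split(450)
print(f"PCIe slot: {pcie:.0f} W, each 8-pin: {per_pin:.0f} W")
```

At 450 W total the slot would be drawing roughly 96 W, well past the 75 W slot spec, which is exactly why keeping a higher-resistance shunt on the PCIe sense resistor is the safer play.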

Sent from my IN2025 using Tapatalk


----------



## DaftConspiracy

The only way to effectively alter the load balancing would be (in theory) to alter voltage. Overvolting would increase the amount of power that can be safely drawn through each connection (since current is the risk factor), but it may damage components on the GPU depending on their designed operating voltage. Undervolting would reduce power consumption, but wouldn't provide any additional safety, since the current would be the same.

Sent from my IN2025 using Tapatalk


----------



## Colonel_Klinck

Ok thanks.


----------



## mouacyk

My hunch that the Bykski Eagle waterblock was making poor contact, resulting in high temps in the 50s C, was correct. Fortunately, I did not have to file the standoffs as suggested on here, but... *the standoffs were NOT completely torqued from the manufacturer*. I used contact paper to test 3 configurations: (*a*) stock, (*b*) thicker solid washers (from Rockit), and (*c*) standoffs fully torqued, thicker solid washers, and a copper heatsink mounted on the backplate. While there wasn't much difference in the imprint from *a* to *b*, *c* showed remarkably more coverage on the GPU die. This ended up giving me load temps under FurMark stress testing at 360W in the high 30s, maxing around 38-39C. With the stock mount *a*, I was easily hitting 57-58C. I think the copper heatsink on the backplate is negligible and the main factors are (1) torquing the standoffs completely and (2) torquing the mounting screws slowly but completely.










So, the max delta of 17C over 22C ambient is much more in line with what I expected from watercooling the 3080, based on my past experience with a fully power-unlimited 1080 Ti. My original delta of 36C felt very wrong, and indeed it was.

Modern Warfare after 2 hours, fps limit=138 @ 144Hz:









Here's a chart at HardwareLuxx that shows various blocks and their temperature deltas. It gives an idea of the best and worst mounts, but rad space may be an unknown factor for many of the setups:

[Übersicht] - RTX 30x0 Wasserkühlervergleich | GPU Block Comparison | Hardwareluxx

Surprise -- there's ongoing discussion in the 3090 owner's thread about Bykski blocks not clearing VRM capacitors... GPU supply was scarce enough, I wonder what Bykski's QC rush was/is.


----------



## Krisztias

BMauri92 said:


> Guys I was able to flash my trio to the evga bios and this was my result
> 
> https://www.3dmark.com/3dm/57308772


Hi!

Your result looks a little low to me for a 3x 8-pin card.
This is mine with an FE (2x 8-pin), +15% power (370W), +125 core, +1150 on mem, and a 3800X:

Time Spy:








I scored 17 640 in Time Spy -- AMD Ryzen 7 3800X, NVIDIA GeForce RTX 3080 x 1, 16300 MB, 64-bit Windows 10 (www.3dmark.com)





Port Royal:








I scored 12 588 in Port Royal -- AMD Ryzen 7 3800X, NVIDIA GeForce RTX 3080 x 1, 16300 MB, 64-bit Windows 10 (www.3dmark.com)





Are your RAM settings ok? No software problems or too many processes in the background?


----------



## DaftConspiracy

mouacyk said:


> My hunch about the Byski Eagle waterblock making poor contact resulting in high temps in 50C's is correct. Fortunately, I did not have to file the standoffs as suggested on here, but ... *the standoffs were NOT completely torqued from the manufacturer*. I used contact paper to test 3 settings: (*a*) stock (*b*) thicker solid washer (from rockit) and (*c*) torquing standoffs fully, used thicker solid washer, and mounted a copper heatsink on backplate. While there wasn't much difference in the imprint from *a* to *b*, *c* showed remarkably more coverage on the GPU die. This ended up giving me load temps under furmark stress testing at 360W in the high 30's, maxing around 38-39C. With stock mount *a*, I was easily hitting 57-58C. I think the copper heatsink on the backplate is negligible and the main factors are (1) torquing the standoffs completely and (2) torquing the mounting screws slowly but also completely.
> 
> View attachment 2475753
> 
> 
> So, the max delta of 17C over 22C ambient is much more in line with what I expected to get from watercooling the 3080, based on my past experiences with a fully power-unlimited 1080TI. My original delta of 36C felt very wrong and is indeed wrong.
> 
> Here's a chart at HardwareLuxx that shows various blocks and their temperature deltas. It gives an idea of the best and worst mounts, but rad space may be an unknown factor for many of the setups:
> 
> [Übersicht] - RTX 30x0 Wasserkühlervergleich | GPU Block Comparison | Hardwareluxx
> 
> Surprise -- there's ongoing discussion in the 3090 owner's thread about Bykski blocks not clearing VRM capacitors... GPU supply was scarce enough, I wonder what Bykski's QC rush was/is.


Makes me want to check mine on my EK block. My temps aren't too bad, but I was able to get a delta of 6-7C between water and die on my 2080S after grinding down the block standoffs (EVGA HC block). So either my mount pressure is too low or the cooling engine isn't optimized for the high flow and pressure of my pump.

Sent from my IN2025 using Tapatalk


----------



## mouacyk

DaftConspiracy said:


> Makes me want to check mine on my EK block. My temps aren't too bad but I was able to get a delta of 6-7c between water and die on my 2080s after grinding down the block standoffs (evga HC block). So either my mount pressure is too low or the cooling engine isn't optimized for the high flow & pressure of my pump.
> 
> Sent from my IN2025 using Tapatalk


If anyone is getting unexpected deltas, it's worth checking. You can easily tell if the standoffs are not torqued enough -- they'll be quite loose. Please be careful, though, and have some way of checking for even mounting pressure, because the manufacturer might have tuned the standoffs for even mounting pressure on the die rather than maximum mounting pressure. In my case, the contact paper helped immensely because I wasn't just torquing the standoffs blindly. Luckily, I didn't have to experiment with various levels of torque either -- just doing it once, fully, resulted in more even contact and better results. In my sample, I would say it was machined well, just loosely/badly assembled.


----------



## noxyd

mouacyk said:


> My hunch about the Byski Eagle waterblock making poor contact resulting in high temps in 50C's is correct. Fortunately, I did not have to file the standoffs as suggested on here, but ... *the standoffs were NOT completely torqued from the manufacturer*. I used contact paper to test 3 settings: (*a*) stock (*b*) thicker solid washer (from rockit) and (*c*) torquing standoffs fully, used thicker solid washer, and mounted a copper heatsink on backplate. While there wasn't much difference in the imprint from *a* to *b*, *c* showed remarkably more coverage on the GPU die. This ended up giving me load temps under furmark stress testing at 360W in the high 30's, maxing around 38-39C. With stock mount *a*, I was easily hitting 57-58C. I think the copper heatsink on the backplate is negligible and the main factors are (1) torquing the standoffs completely and (2) torquing the mounting screws slowly but also completely.
> 
> View attachment 2475753
> 
> 
> So, the max delta of 17C over 22C ambient is much more in line with what I expected to get from watercooling the 3080, based on my past experiences with a fully power-unlimited 1080TI. My original delta of 36C felt very wrong and is indeed wrong.
> 
> Here's a chart at HardwareLuxx that shows various blocks and their temperature deltas. It gives an idea of the best and worst mounts, but rad space may be an unknown factor for many of the setups:
> 
> [Übersicht] - RTX 30x0 Wasserkühlervergleich | GPU Block Comparison | Hardwareluxx
> 
> Surprise -- there's ongoing discussion in the 3090 owner's thread about Bykski blocks not clearing VRM capacitors... GPU supply was scarce enough, I wonder what Bykski's QC rush was/is.


I'm seeing 54C under load with my EK block so it makes me wonder... 
I was expecting less. 
Unfortunately, removing my gpu is a real nightmare in my sff case...


----------



## mouacyk

noxyd said:


> I'm seeing 54C under load with my EK block so it makes me wonder...
> I was expecting less.
> Unfortunately, removing my gpu is a real nightmare in my sff case...


I have mine in a pretty cramped FT03-T mATX case and it took some dedication to disassemble and reassemble it too. It was worth it in the end, because I can tell the contact is more even and the result was better than I expected. On the upside, you end up correcting a shoddy manufacturer assembly that, left uncorrected, at best hampers your GPU's stability and at worst can burn out your GPU under long-duration stress. Another issue others ran into with blocks is thermal throttling on these very hot GDDR6X VRAM chips -- my guess again is poor contact, from (less likely) machining or (more likely) assembly defects. With the cost and scarcity of these GPUs, I think it's worth every effort to inspect all that you can.


----------



## BurrZerkaa

DaftConspiracy said:


> Contrary to what most people think the voltage slider isn't actually placebo. It won't offset the voltage or alter the voltage/fr curve, but it will adjust the voltage limit. Say you set the card to hit 2.1ghz at 1.075v and 2.13ghz at 1.1v. be default it will only go up to 2.1ghz at 1.075v, but if you set the voltage to 100% it'll go up to 2.13ghz at 1.1v. Keep in mind if you tell it to hit 2.13ghz at 1.081v and maintain that frequency through 1.1v it will only ever set voltage to 1.081v, since that's the minimum you told it that it could run at that frequency.


Thanks for this. I'm running a 2x 8-pin Gigabyte Gaming OC, and when adjusting my curve I was ignoring the 15MHz offset, which was apparently my issue. So far I'm completely stable at +500 mem and 2050MHz core @ 1.1v. I'm thinking I might be able to hit 2100MHz stable, as I was benching at 2100MHz without realizing I wasn't offsetting properly in the curve. Anyway, thanks for the tip. Time to push this baby a little more.


----------



## noxyd

mouacyk said:


> I have mine in a pretty cramped FT03-T mATX case and it took some dedication to disassemble and reassemble it too. It was worth it in the end, because I can tell the contact was more even and the result was better than I expected. On the upside, you end up correcting a shoddy manufacturer assembly that when left uncorrected at best hampers your GPU stability and at worst can burn out your GPU in long duration stressing. Another issue others ran into with blocks is thermal throttling on these very hot GDDR6X VRAM chips -- my guess again is poor contact based on (less likely) machining or (more likely) assembly defects. With the cost and scarcity of these GPUs, I think it worth every effort to inspect all that you can.


Out of curiosity, what radiator size are you using?

I tried to check quickly; the standoffs I can see don't seem to be loose (but I can't see all of them).


----------



## DaftConspiracy

BurrZerkaa said:


> Thanks for this. I'm running a 2-pin Gigabyte Gaming OC and when adjusting my curve I was ignoring the 15mhz offset, which was my issue apparently. So far I'm completely stable at +500 mem and 2050Mhz core @ 1.1v. I'm thinking I might be able to hit 2100Mhz stable as I was benching at 2100mhz without realizing I wasn't offsetting properly in the curve. Anyways, thanks for the tip. On to push this baby a little more.


You should be able to run 2130MHz at 1.1v in benchmarks as long as you didn't completely get screwed by the silicon lottery. My card can run that completely stable in games, and I'd say it's pretty average silicon. I run +800 on mem but I'm still playing with it; any higher and I stopped seeing gains in Time Spy, so I assume it starts hitting errors at or above +800.

Sent from my IN2025 using Tapatalk


----------



## noxyd

DaftConspiracy said:


> You should be able to run 2130mhz at 1.1v in benchmarks as long as you didn't complete get screwed by the silicon lottery. My card can run that completely stable in games, and I'd say its pretty average silicon. I run +800 on mem but I'm still playing with it. Any higher and I stopped seeing gains in time spy, so I assume it starts hitting errors above or at +800.
> 
> Sent from my IN2025 using Tapatalk


Unless you're talking about a shunt-modded card, I don't think 2130MHz @ 1.1v is "average silicon".


----------



## mouacyk

noxyd said:


> Out of curiosity, what radiator size are you using?
> 
> I tried to check quickly, the standoffs I can see don’t seem to be loose (but can’t see all of them)


XSPC EX120 + GTX360. What's important about my result is that I didn't change the water loop, but got a 19C change in delta over ambient.


----------



## MrBridgeSix

My RTX 3080 Eagle OC stopped booting after 3 months; it's not related to the power connectors, they looked fine. Luckily the retailer where I bought the card handled the RMA and will be sending me a Palit RTX 3080 Gaming Pro as a replacement, because that is what they have in stock.

I checked the BIOS database and this model seems to have a 350W PL instead of 375W. Have any Gaming Pro owners here been testing other BIOSes?


----------



## DaftConspiracy

noxyd said:


> Except if you're talking about a shunt modded card, I don't think that 2130 mhz @1,1v is "average silicon"


I've seen people run 2200MHz stable on these; now those are golden chips. I'd say mine is pretty average, maybe slightly above. The highest I can run Port Royal is 2175MHz, and that's really pushing it.

Speaking of Port Royal, I just discovered that the higher your VRAM frequency, the better your score. It doesn't matter if it's throwing mad errors; somehow it still likes it. Maybe the more errors it corrects the better the score.

Going from +800MHz (what I guesstimate to be "stable") to +1500MHz netted me a 150-point increase in score. Seems the same with Time Spy. Error-correcting memory is so confusing; I want my old regular memory back so I know when it is and isn't working.

Sent from my IN2025 using Tapatalk


----------



## DaftConspiracy

MrBridgeSix said:


> My RTX 3080 Eagle OC stopped booting after 3 months, not related to the power connectors, they looked fine. Luckily the retailer where I bought the card handled the RMA and will be sending me a Palit RTX 3080 Gaming Pro as a replacement because that is what they have in stock.
> 
> Checked the BIOS database and this model seems to have a 350W PL instead of 375W, any Gaming Pro owners here have been testing other BIOS?


You can't really increase the power limit on the 2x 8-pin 3080s via the power slider. They pretty much all draw up to 360W (according to GPU-Z; HWiNFO reads a max of 350W) regardless of BIOS.

Sent from my IN2025 using Tapatalk


----------



## BurrZerkaa

DaftConspiracy said:


> You should be able to run 2130mhz at 1.1v in benchmarks as long as you didn't complete get screwed by the silicon lottery. My card can run that completely stable in games, and I'd say its pretty average silicon. I run +800 on mem but I'm still playing with it. Any higher and I stopped seeing gains in time spy, so I assume it starts hitting errors above or at +800.


Yep, bumped it up to 2130MHz with only +500MHz on mem, and have been gaming without issue. I'll have to check complete stability later, but right now gaming seems good.


----------



## DaftConspiracy

I swear I'm done now

Port Royal:








I scored 12 919 in Port Royal -- AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10 (www.3dmark.com)





Time Spy:








I scored 18 147 in Time Spy -- AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10 (www.3dmark.com)





I could probably squeeze a couple more points out of Port Royal, and a decent bit out of Time Spy if I bring up the minimum frequencies, but that's a lot of work for not much reward.

Side note: anyone know a good way to test GDDR6X stability? Benchmarks clearly don't give any useful information.

Sent from my IN2025 using Tapatalk


----------



## geriatricpollywog

DaftConspiracy said:


> I swear I'm done now


You're gonna get beaten by a girl? (Not me -- the result I linked is TastyPC from YouTube.)

NVIDIA GeForce RTX 3080 video card benchmark result - AMD Ryzen 9 3900X,Gigabyte Technology Co., Ltd. X570 AORUS MASTER (3dmark.com)


----------



## noxyd

DaftConspiracy said:


> I swear I'm done now
> 
> Port Royal: I scored 12 919 (www.3dmark.com)
> 
> Time Spy: I scored 18 147 (www.3dmark.com)
> 
> I could probably squeeze a couple more point out of Port Royal, and a decent bit out of time spy if I bring up the minimum frequencies, but that's a lot of work for not much reward.
> 
> Side note, anyone know a good way to test ddr6x stability? Since benchmarks clearly don't give any useful information.
> 
> Sent from my IN2025 using Tapatalk


What I don't understand is how I get the same TS score as you but with 100MHz less (and 10C less, for that matter)...


----------



## mouacyk

DaftConspiracy said:


> I swear I'm done now
> 
> Port Royal: I scored 12 919 (www.3dmark.com)
> 
> Time Spy: I scored 18 147 (www.3dmark.com)
> 
> I could probably squeeze a couple more point out of Port Royal, and a decent bit out of time spy if I bring up the minimum frequencies, but that's a lot of work for not much reward.
> 
> Side note, anyone know a good way to test ddr6x stability? Since benchmarks clearly don't give any useful information.
> 
> Sent from my IN2025 using Tapatalk


RTX 3080 Memory Checker:


----------



## VPII

BMauri92 said:


> Guys I was able to flash my trio to the evga bios and this was my result
> 
> https://www.3dmark.com/3dm/57308772


My friend, if you are running an MSI RTX 3080 Gaming X Trio, the best BIOS I found for this card was the MSI RTX 3080 Suprim X, since it uses the same fans and as such the fan speeds would be similar. Just a suggestion.


----------



## ssgwright

DaftConspiracy said:


> I swear I'm done now
> 
> Port Royal: I scored 12 919 (www.3dmark.com)
> 
> Time Spy: I scored 18 147 (www.3dmark.com)
> 
> I could probably squeeze a couple more point out of Port Royal, and a decent bit out of time spy if I bring up the minimum frequencies, but that's a lot of work for not much reward.
> 
> Side note, anyone know a good way to test ddr6x stability? Since benchmarks clearly don't give any useful information.
> 
> Sent from my IN2025 using Tapatalk


oh come on.. you can do better:









I scored 12 980 in Port Royal -- Intel Core i9-10850K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10 (www.3dmark.com)


----------



## MrKenzie

Any suggestions why my driver crashes when overclocking memory past +550MHz? I've seen others mention their performance going down with memory errors, but mine doesn't do that at all; it just crashes to desktop. It happens in benchmarks and in games.

I have an iGame Advanced OC, 450W BIOS, water cooled and chilled to between 5-15C, so temperature isn't the issue.
Before water cooling, the maximum memory overclock was +400MHz; any higher and it would crash to desktop.

I assume I just have poor memory modules compared to some. It's a shame, because I could easily reach 13,000pts in Port Royal if they clocked to +800MHz or higher.








I scored 12 832 in Port Royal -- Intel Core i7-7700K Processor, NVIDIA GeForce RTX 3080 x 1, 16384 MB, 64-bit Windows 10 (www.3dmark.com)


----------



## DaftConspiracy

noxyd said:


> What I don’t understand is how I get the same TS score as you but with 100mhz less (and 10C for that matter)...


I've noticed that when I push the card past full gaming stability the score drops a decent bit, but if I push it high enough I can compensate for it. Also, not sure what your power limit is, but mine bounces off the limit quite a bit in part 2 of the test.

Sent from my IN2025 using Tapatalk


----------



## DaftConspiracy

ssgwright said:


> oh come on.. you can do better:
> 
> I scored 12 980 in Port Royal -- Intel Core i9-10850K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10 (www.3dmark.com)


Seems that 3d mark favors Intel CPUs for some reason

Sent from my IN2025 using Tapatalk


----------



## DaftConspiracy

MrKenzie said:


> Any suggestions why my driver crashes when overclocking memory past +550MHz? I've seen others mention their performance goes down with memory errors, but mine doesn't do that at all, it just crashes to desktop. It happens in benchmarks and also games.
> 
> I have an iGame Advanced OC, 450W bios, water cooled, and chilled to between 5c-15c, so temperature isn't the issue.
> Before water cooling the maximum memory overclock was +400MHz; any higher and it would crash to desktop.
> 
> I assume I just have poor memory modules compared to some. It's a shame, because I could easily reach 13,000pts in Port Royal if they clocked to +800MHz or higher.
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 12 832 in Port Royal
> 
> 
> Intel Core i7-7700K Processor, NVIDIA GeForce RTX 3080 x 1, 16384 MB, 64-bit Windows 10
> 
> 
> 
> 
> www.3dmark.com


That's strange, seems like the error correcting function isn't working on yours

Sent from my IN2025 using Tapatalk


----------



## Arni90

ssgwright said:


> oh come on.. you can do better:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 12 980 in Port Royal
> 
> 
> Intel Core i9-10850K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10
> 
> 
> 
> 
> www.3dmark.com


Oh come on... You can do better: https://www.3dmark.com/pr/380842


----------



## leegoocrap

Came a long way with the xc3... Hoping there is still a little more to squeeze out









I scored 12 596 in Port Royal


AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 16384 MB, 64-bit Windows 10




www.3dmark.com


----------



## mouacyk

Thanks to @Mumak, HWiNFO now reports GPU Memory junction temperature:








[Official] NVIDIA RTX 3090 Owner's Club


This is only a ~320W ETH mining load on both tests. I'll try to push this card when I have more time to mess with it. Before (GPU was 60C): After (water): whats your cooling solution? im at 43c on 360mmx60mm rad (also 9900k in loop) with same power draw while mining, 23c ambient air temp...




www.overclock.net


----------



## ssgwright

Arni90 said:


> Oh come on... You can do better: https://www.3dmark.com/pr/380842


lol, nice score!


----------



## Imprezzion

I've had 3 shops today alone cancel my order even tho they showed stock when I ordered the cards.. it's absolutely impossible to get a 3080 now in the Netherlands.. every now and then stock pops up but if you order then they don't seem to exist..

The past week or so I shuffled over 5 grand between shops and orders, and all in all I'm getting it all back and still don't actually own a card.. for god's sake...

I'm almost at the point where I'm like frigg it I'm just going to order a Gigabyte 3090 Aorus Master which is actually in stock for a change even tho it's well over 2 grand... Patience is a virtue..


----------



## Colonel_Klinck

mouacyk said:


> Thanks to @Mumak, HWiNFO now reports GPU Memory junction temperature:
> 
> 
> 
> 
> 
> 
> 
> 
> [Official] NVIDIA RTX 3090 Owner's Club
> 
> 
> This is only a ~320W ETH mining load on both tests. I'll try to push this card when I have more time to mess with it. Before (GPU was 60C): After (water): whats your cooling solution? im at 43c on 360mmx60mm rad (also 9900k in loop) with same power draw while mining, 23c ambient air temp...
> 
> 
> 
> 
> www.overclock.net


Doesn't give me that reading on my TUF OC


----------



## josephimports

Colonel_Klinck said:


> Doesn't give me that reading on my TUF OC


Be sure to use the official v6.42 and not the beta. Works fine on my TUF with max reported temp of 52c under water with +1000 mhz.


----------



## Colonel_Klinck

josephimports said:


> Be sure to use the official v6.42 and not the beta. Works fine on my TUF with max reported temp of 52c under water with +1000 mhz.



It was hiding down with my System Pump 5 in _Windows Hardware Errors (WHEA)_. No idea why.


----------



## DaftConspiracy

Took the jet plate out of my GPU block and delta between the die and water dropped 4c so that's pretty sweet. Jet plates definitely only help if you have a weak pump or run low pump speed. If you run a D5 or better at full speed I say take them out.

Sent from my IN2025 using Tapatalk


----------



## mouacyk

DaftConspiracy said:


> Took the jet plate out of my GPU block and delta between the die and water dropped 4c so that's pretty sweet. Jet plates definitely only help if you have a weak pump or run low pump speed. If you run a D5 or better at full speed I say take them out.
> 
> Sent from my IN2025 using Tapatalk


That's interesting, good find.


----------



## ssgwright

DaftConspiracy said:


> Took the jet plate out of my GPU block and delta between the die and water dropped 4c so that's pretty sweet. Jet plates definitely only help if you have a weak pump or run low pump speed. If you run a D5 or better at full speed I say take them out.
> 
> Sent from my IN2025 using Tapatalk


hmm I was thinking of taking mine out... maybe I'll give it a shot, 4c? really?


----------



## DaftConspiracy

ssgwright said:


> hmm I was thinking of taking mine out... maybe I'll give it a shot, 4c? really?


3-4c, bringing me down from a delta of 12c to 8c which I find much more reasonable. I'd take my CPU jet plate out too if I hadn't lapped the cold plate with it still in (since it makes the plate bow). Think I'll just open up the slit wider on that one.



mouacyk said:


> That's interesting, good find.


Thanks

Sent from my IN2025 using Tapatalk


----------



## SoldierRBT

Tested a 3080 FE: 0.981v locked at 2085MHz, +1050 memory. Score: 12,575. Max temp: 54C at 370W.
















www.3dmark.com


----------



## ssgwright

SoldierRBT said:


> Tested a 3080 FE 0.981v Locked at 2085MHz +1050 Memory Score: 12,575. Max temp: 54C 370W
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> View attachment 2476052


wow, 2085 at less than 1v, impressive


----------



## spongebobsq1

Anyone read this? HWiNFO version 6.42 can now also display the GDDR6X temperatures of NVIDIA’s new Ampere cards - Out now | igor´sLAB


----------



## spongebobsq1

I have been mining with NiceHash and my GPU memory junction temps are around 96c. It says at 110c it will throttle. Do you think 96c is too hot? I have my memory OC at +915 on an EVGA FTW3 Ultra 3080.


----------



## mouacyk

GPU Memory Junction Temp after 1 hour of CODMW:









Maybe the copper heatsink is helping here?


----------



## SoldierRBT

ssgwright said:


> wow 2085 at less than 1.v impressive


The card is from a friend. Seems to be okay on the core, around 75MHz lower than my FTW3. Memory is good, it can bench +1100 no issue, but the 370W TDP hurts a lot.


----------



## ssgwright

SoldierRBT said:


> The card is from a friend. Seems to be okay on the core, around 75MHz lower than my FTW3. Memory is good can bench +1100 no issue but the 370W TDP hurts a lot.


unfortunately my memory isn't as good as others here... I start losing performance above +800mhz


----------



## DaftConspiracy

ssgwright said:


> unfortunately my memory isn't as good as others here... I start losing performance above +800mhz


According to that ampere memory tester mine won't throw errors even at +1500mhz which I find hard to believe. Have you tried it?

Sent from my IN2025 using Tapatalk


----------



## acoustic

DaftConspiracy said:


> According to that ampere memory tester mine won't throw errors even at +1500mhz which I find hard to believe. Have you tried it?
> 
> Sent from my IN2025 using Tapatalk


Link please


----------



## DaftConspiracy

acoustic said:


> Link please


It's in this thread somewhere on this page or the previous, can't seem to find it

Sent from my IN2025 using Tapatalk


----------



## DaftConspiracy

I scored 12 924 in Port Royal


AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10




www.3dmark.com





That's all she's got. Seems that the higher the peak CPU frequency the better the score, even though it's far from thread bound. I'd play around with the CPU boost algorithm more, but the Afterburner frequency curve started acting up on me (overboosting or underboosting, not showing the active curve and not applying the correct curve) so I think I'm done. I do hope they fix that with the next Afterburner update though.

Sent from my IN2025 using Tapatalk


----------



## josephimports

acoustic said:


> Link please


See post #3559


----------



## Hresna

mouacyk said:


> From PCB shots, I see that the Strix has 2 more GPU phases? Is that just extra padding and justification to charge more for the extra power pin then?
> Please share if anyone comes across pcb shot for the Aorus 3080 Extreme. Thanks.
> Here is the Aorus Master (funny it has all 6 POS caps, whereas cheaper Eagle OC has 1 MLC ):


If I recall, I saw Der8auer do an analysis of the strix PCB and mapped all the VRMs, there were something like 4 or 6 of them for memory.


----------



## acoustic

DaftConspiracy said:


> It's in this thread somewhere on this page or the previous, can't seem to find it
> 
> Sent from my IN2025 using Tapatalk





josephimports said:


> See post #3559


Thanks gentlemen! I'll check it out when I get home from a work trip. My 3080 FTW3 Ultra on stock cooler and initial Ampere drivers would max at +550mem. With the EVGA Hybrid cooler and the 461.33, I was able to do +650mem and see increases. I'll check the program out for sure.


----------



## DaftConspiracy

You guys are seeing decreased performance in 3dmark when it starts throwing errors? Maybe mine really is stable at +1500mhz

Sent from my IN2025 using Tapatalk


----------



## acoustic

Yes. GDDR6X performs error checking when memory is unstable; it will rarely ever blue-screen or outright crash. This is noticed in performance drops, as the memory is doing the same work twice (or more) to get it right.

I used Heaven benchmark. I paused the benchmark in a spot, and monitored the framerate with it in Windowed Mode. I changed the memory clocks and viewed the framerate to see if it went up or down.

After +650, my framerate goes down. Previously, it would go down above +550, but either the Hybrid cooler is doing a better job of cooling the vRAM or the drivers opened the legs up a little. I don't really know.

If I go off a cold boot, I can even run +700 and see gains, but after 1-2 runs, that's it - back to memory correction. I'm very tempted to get the card under a full-cover block and see if that helps, but the Hybrid does a pretty great job and my PC is dead silent.
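The framerate method above can be expressed as a small offline helper. This is a hypothetical sketch, not anything from the thread: it assumes you've already logged average FPS for a paused Heaven scene at each memory offset, and it just finds the last offset before error correction starts eating performance (GDDR6X retries silently, so a falling framerate is the only visible symptom).

```python
# Hypothetical helper for the framerate-drop method described above:
# given (offset_mhz, fps) samples from a paused, windowed Heaven scene,
# return the best memory offset before error correction costs performance.

def best_memory_offset(samples: list[tuple[int, float]], tolerance: float = 0.5) -> int:
    """Scan samples in ascending offset order; stop once FPS falls by more
    than `tolerance` below the best seen (error correction kicking in)."""
    samples = sorted(samples)
    best_offset, best_fps = samples[0]
    for offset, fps in samples[1:]:
        if fps < best_fps - tolerance:
            break  # retries are now costing frames; earlier offset wins
        if fps > best_fps:
            best_offset, best_fps = offset, fps
    return best_offset

# Made-up readings: FPS climbs until +650, then falls off.
readings = [(0, 140.0), (300, 142.5), (550, 144.0), (650, 144.8), (700, 143.1)]
print(best_memory_offset(readings))  # 650
```

The `tolerance` parameter is there because FPS readings jitter a little even at a fixed clock; half a frame is an arbitrary starting point.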


----------



## edhutner

Is there any way to know when there is GDDR6X correction or an error?


----------



## KBDE

Interesting to see the memory temps of other cards. My factory watercooled 3080 reaches 90C+ :/


----------



## blackzaru

DaftConspiracy said:


> Took the jet plate out of my GPU block and delta between the die and water dropped 4c so that's pretty sweet. Jet plates definitely only help if you have a weak pump or run low pump speed. If you run a D5 or better at full speed I say take them out.
> 
> Sent from my IN2025 using Tapatalk





mouacyk said:


> That's interesting, good find.





ssgwright said:


> hmm I was thinking of taking mine out... maybe I'll give it a shot, 4c? really?





DaftConspiracy said:


> 3-4c, bringing me down from a delta of 12c to 8c which I find much more reasonable. I'd take my CPU jet plate out too if I hadn't lapped the cold plate with it still in (since it makes the plate bow). Think I'll just open up the slit wider on that one.
> 
> 
> 
> Thanks
> 
> Sent from my IN2025 using Tapatalk



It might be important to note that he said he was running with a reversed water flow (water entering the outlet port instead of the inlet port). Results could thus be very different for people with water flowing the other way.


----------



## DaftConspiracy

acoustic said:


> Yes. GDDR6X performs error checking when memory is unstable; it will rarely ever blue-screen or outright crash. This is noticed in performance drops, as the memory is doing the same work twice (or more) to get it right.
> 
> I used Heaven benchmark. I paused the benchmark in a spot, and monitored the framerate with it in Windowed Mode. I changed the memory clocks and viewed the framerate to see if it went up or down.
> 
> After +650, my framerate goes down. Previously, it would go down above +550, but either the Hybrid cooler is doing a better job of cooling the vRAM or the drivers opened the legs up a little. I don't really know.
> 
> If I go off a cold boot, I can even run +700 and see gains, but after 1-2 runs, that's it - back to memory correction. I'm very tempted to get the card under a full-cover block and see if that helps, but the Hybrid does a pretty great job and my PC is dead silent.


I'll have to try that with heaven. I've tried in games and haven't noticed any real changed in framerate but it's hard to monitor without a locked scene like you can get in heaven. 3dmark seemed to only increase with speeds but if minimums are going down it's not worth the bump to averages.

Sent from my IN2025 using Tapatalk


----------



## DaftConspiracy

blackzaru said:


> It might be important to note that he said he was running with an inversed waterflow (water entering the outlet port instead of the inlet port). Results could thus be very different for people with water flowing the other way.


Good point. That said, EK does claim reversed flow only makes a 1-2c difference on their GPU blocks, so a 4c difference is outside that range. I did find a post somewhere about a guy running a 3x D5 pump loop with dual GPUs, a CPU, and some very big rads who experienced gains by removing his as well (I think his were set up the "correct" way). Will definitely need more testing to know how much each variable plays a factor.

Also, if anyone is running an EK Vector CPU block that is not lapped I highly recommend removing the jet plate just to improve contact with the IHS.

Sent from my IN2025 using Tapatalk


----------



## AngEv1L

KBDE said:


> Interesting to see the memory temps of other cards. My factory watercooled 3080 reaches 90C+ :/


On my KFA2 SG 3080 with a Bykski WB and backplate I see 60-62c memory and 50-52c GPU, but I need better thermal paste and one more radiator (currently 1x360mm)


----------



## DaftConspiracy

AngEv1L said:


> On my KFA2 SG 3080 with a Bykski WB and backplate I see 60-62c memory and 50-52c GPU, but I need better thermal paste and one more radiator (currently 1x360mm)


50 is too high, check contact between the block and die

Sent from my IN2025 using Tapatalk


----------



## DaftConspiracy

KBDE said:


> Interesting to see the memory temps of other cards. My factory watercooled 3080 reaches 90C+ :/


Id take it apart and make sure the pads are making good contact. Is it a hybrid cooler or actual water block?

Sent from my IN2025 using Tapatalk


----------



## AngEv1L

DaftConspiracy said:


> 50 is too high, check contact between the block and die
> 
> Sent from my IN2025 using Tapatalk


Yes, I want to do it when I buy better thermal paste and a radiator.


----------



## mouacyk

AngEv1L said:


> On my KFA2 SG 3080 with a Bykski WB and backplate I see 60-62c memory and 50-52c GPU, but I need better thermal paste and one more radiator (currently 1x360mm)


Check that the standoffs are tightened. The ones around the GPU die were not on my block and I was getting temps like that, even up to 58C on prolonged gaming and benches. After tightening them all and confirming better contact with pressure paper, I got much better temps, maxing at 40C.

I have more details in an earlier post in this thread: [Official] NVIDIA RTX 3080 Owner's Club


----------



## KBDE

DaftConspiracy said:


> Id take it apart and make sure the pads are making good contact. Is it a hybrid cooler or actual water block?
> 
> Sent from my IN2025 using Tapatalk


Water block. It's an Inno3D 3080 iChill Frostbite.
Problem is the warranty-void sticker nonsense with Inno3D. Otherwise I would take it apart and check the pads.

It's easily hitting 95C while gaming. I wonder if they even took the plastic off the pads.
Before I void the warranty to fix a problem that might not exist, I'd like to read out the individual memory chip temps.


----------



## mouacyk

KBDE said:


> Water block. It's an inno3d 3080 ichill frostbite.
> Problem is the warrenty void sticker nonsense with Inno3d. Otherwise i would take it appart and check the pads.
> 
> It's easily hitting 95C while gaming. I wonder if they even took the plastic off the pads.
> Before i void warranty to fix a problem that might be none i'd like to read out the individual memory chip temps.


FTC Notes That 'Warranty Void if Removed' Stickers Are Illegal | Digital Trends 


> *Those ‘warranty void if removed’ stickers are illegal, says the FTC*


I believe this is true in the EU, too.


----------



## Colonel_Klinck

Hmm I ran the Ampere Mem Test on my TUF OC. +1500 for 30 mins and no errors. 
Max 54c on memory temps.

An hour of playing Insurgency Sandstorm at +1000 max 70c custom loop EKWB


----------



## DaftConspiracy

Colonel_Klinck said:


> Hmm I ran the Ampere Mem Test on my TUF OC. +1500 for 30 mins and no errors. Max 54c on memory temps.
> 
> An hour of playing Insurgency Sandstorm at +1000 max 70c custom loop EKWB


I'm going to try running it while running furmark, I think the additional heat from the die will heat the chips up quite a bit

Sent from my IN2025 using Tapatalk


----------



## DaftConspiracy

KBDE said:


> Water block. It's an inno3d 3080 ichill frostbite.
> Problem is the warrenty void sticker nonsense with Inno3d. Otherwise i would take it appart and check the pads.
> 
> It's easily hitting 95C while gaming. I wonder if they even took the plastic off the pads.
> Before i void warranty to fix a problem that might be none i'd like to read out the individual memory chip temps.


Looks like it's the exact same block as Alphacool's, bet they're made in the same factory. Perhaps someone with an alphacool block can check their mem temps. I'll check mine when I get home as well.

Sent from my IN2025 using Tapatalk


----------



## Colonel_Klinck

DaftConspiracy said:


> I'm going to try running it while running furmark, I think the additional heat from the die will heat the chips up quite a bit
> 
> Sent from my IN2025 using Tapatalk


Yeah the "hmm" was more +1500 being memory error free after 30 mins.

My die hits 50c as well on prolonged gaming. I'm taking it out to make changes to tube layout at the weekend so will check standoffs for tightness.


----------



## DaftConspiracy

Colonel_Klinck said:


> Yeah the "hmm" was more +1500 being memory error free after 30 mins.
> 
> My die hits 50c as well on prolonged gaming. I'm taking it out to make changes to tube layout at the weekend so will check standoffs for tightness.


From what I hear GDDR6X is extremely sensitive to temperature, so maybe water-cooling increases stability enough? Or maybe that test isn't heating the chips up enough, since the die sits idle during the test.

Sent from my IN2025 using Tapatalk


----------



## Shadowdane

I've never tried undervolting before.. how does this look with a small undervolt, 0.975v @ ~2010-2025Mhz? I wanted to try the lowest voltage I could while maintaining over 2000Mhz. Ignore the 1% & 0.1% Low stats; I had the monitoring for those disabled.


----------



## DaftConspiracy

Shadowdane said:


> I've never tried undervolting before.. how does this look with a small undervolt 0.975v @ ~2010-2025Mhz. I wanted to try to do the lowest voltage I could but maintain over 2000Mhz. Ignore the 1% & 0.1% Low stats I had monitoring on that disabled.
> 
> View attachment 2476118


Pretty good! Though Port Royal isn't a great test for real world stability and power draw. I can run Port Royal about 75mhz higher than in games, and it draws less power, so clocks will be more steady during runs than they will be in games.

Sent from my IN2025 using Tapatalk


----------



## DaftConspiracy

KBDE said:


> Water block. It's an inno3d 3080 ichill frostbite.
> Problem is the warrenty void sticker nonsense with Inno3d. Otherwise i would take it appart and check the pads.
> 
> It's easily hitting 95C while gaming. I wonder if they even took the plastic off the pads.
> Before i void warranty to fix a problem that might be none i'd like to read out the individual memory chip temps.


Yeah, something's severely wrong with your cooling solution. Mine maxes out at 64c while water temp is at 38c during furmark and the mem test combined (470w consumption).

Sent from my IN2025 using Tapatalk


----------



## cennis

Did anyone try flashing a 2x8pin bios onto a 3x8pin card to cause the 3rd 8pin to bug out and pull extra power? I think this works for some 3090s?


----------



## SoldierRBT

cennis said:


> Did anyone try flashing a 2x8pin bios onto a 3x8pin card to cause the 3rd 8pin to bug out and pull extra power? I think this works for some 3090s?


I tried. It doesn’t work. Flashed the XC3 3080 BIOS onto a FTW3 3080.


----------



## cennis

SoldierRBT said:


> I tried. It doesn’t work. Flashed the XC3 3080 BIOS onto a FTW3 3080.


Doesn't work as in limited to the 320W of the XC3? Checked power pull from the wall?



https://www.3dmark.com/pr/617704


is this shunted?


----------



## SoldierRBT

Yeah, the card was limited to 320W, and core clocks were very low too. The 450W BIOS is the best option.


----------



## ZealotKi11er

I just got my 3080 TUF, which I am using to mine ETH. It's at 230W total power. Core is 47C, memory is 102C lol.


----------



## mouacyk

ZealotKi11er said:


> I just got my 3080 TUF, which I am using to mine ETH. It's at 230W total power. Core is 47C, memory is 102C lol.


Yep, which is why I will never buy one of these second-hand.


----------



## KBDE

Okay, I've checked the card from the side, and I already see one big issue. These idiots at Inno3D didn't add a thermal pad for the power stage up right between the memory modules. I can't say if the rest of the pads are correct or not.

It would explain the memory temps since all the heat goes into the pcb and memory next to it.


----------



## obscurehifi

DaftConspiracy said:


> I'm going to try running it while running furmark, I think the additional heat from the die will heat the chips up quite a bit
> 
> Sent from my IN2025 using Tapatalk


That's actually what I did earlier. +1500 wasn't giving me memory errors, but once everything was warmed up with furmark and temps hit 100C, stabilizing around 96C, I got mem errors with the initial test .bat with furmark still running. I'm going to do this again later and see what my limit is. Aorus Xtreme 3080 Waterforce with fans on auto... I did notice the mem temps dropped when I manually increased AIO fan speed, but it doesn't help when the auto fan speed is driven only off of GPU temp.

Sent from my SM-G973U using Tapatalk


----------



## ZealotKi11er

mouacyk said:


> Yep, which is why I will never buy one of these second-hand.


Yep. I would not buy a 3080/3090 second-hand unless you know the warranty can be used. 
I personally don't care since it's ASUS and their RMA center is pretty good at my location, and I have a 6800/6900XT if it does fail. 
The crazy thing is how much power the 3080 memory uses.


----------



## DaftConspiracy

SoldierRBT said:


> I tried. It doesn’t work. Flashed the XC3 3080 BIOS onto a FTW3 3080.


Why not just flash the 450w bios? No game's going to hit that limit, only Time Spy will.

Sent from my IN2025 using Tapatalk


----------



## DaftConspiracy

KBDE said:


> Okay i've checked the card from the side, and i already see one big issue. These idiots at INNO3D didn't add a thermalpad for the powerstage up right between the memory modules. I can't say if the rest of the pads are correct or not.
> 
> It would explain the memory temps since all the heat goes into the pcb and memory next to it.


I wouldn't be surprised if the memory pads were cut too small or one was missing. Junction temp is a hotspot reading, so if even one part of one chip is running hot you'll see insane temps.

Sent from my IN2025 using Tapatalk


----------



## DaftConspiracy

obscurehifi said:


> That's actually what I did earlier. +1500 wasn't giving me memory errors but once everything was warmed up with furmark and temps hit 100C, stabilizing around 96C, I got mem errors with the initial test bat with furmark still running. I'm going to do this again later and see what my limit is. Aorus Xtreme 3080 Waterforce with fans on auto... I did notice the mem temps dropped when I manually increased AIO fan speed but it doesn't help when the auto fan speed is driven only off of gpu temp.
> 
> Sent from my SM-G973U using Tapatalk


I ran furmark at 470w with the 30 min test and still no errors at +1500mhz. Mem temps never left the 60s. I wonder how much my backplate is helping. 

That reminds me, anyone running the 3080 TUF (and presumably other models') backplates: reuse the memory thermal pads on the backplate. The ones EK includes will only make contact with the rear memory modules on 3090s. They cheaped out and didn't include any for the 3080. I recommend stacking thermal pads for the back of the die also, as those were too thin as well for some reason.

Sent from my IN2025 using Tapatalk


----------



## obscurehifi

I did some further tests with Furmark and the mem test using the ZEED Ampere mem tool. I only used the 1initial.bat and stepped from +0, +1000, 1100, 1200, 1300, 1400, 1450, and 1500. I was error free until +1400. +1450 started throwing some errors, and Furmark actually crashed at +1500 during the mem test.










I plotted the GPU and mem temps through the run, and the mem junction temperature reached a max of 102C. I started the Ampere mem tests where the green line goes down, at a +1000 memory offset, where I set the fan speed back to auto. Before that, I maxed my fans out to see if the mem temp comes down with fan speed, and it did a little. Through the tests I found it interesting that the mem temps actually dropped through the start of the mem test, then rose towards the end. You can see a peak in temps when my mem offset was at +1200, and mem temps started to drop a little afterwards. Doesn't look like my AIO does much to cool the memory.


----------



## DaftConspiracy

obscurehifi said:


> I did some further tests with Furmark and mem test using the ZEED amphere mem tool. I only used the 1initial.bat and stepped from +0, +1000, 1100, 1200, 1300, 1400, 1450, and 1500. I was error free until +1400. 1450 started some errors and Furmark actually crashed at +1500 during the mem test.
> View attachment 2476186
> 
> 
> 
> I plotted the GPU and Mem temps through the run with and the mem clock temperature reached a max of 102C. I started the amphere mem tests where the green line goes down to +1000 memory offset, where I set the fan speed back to auto. Before that, I maxed my fans out to see that the mem temp comes down with fan speed and it did a little. Through the tests I found it interesting that the mem temps actually dropped through the start of the mem test then rose towards the end. You can see a peak in temps when my mem offset was at +1200 and mem temps started to drop afterwards by a little. Doesn't look like my AIO does much to cool the memory temp.
> View attachment 2476187


Yeah, that's the issue with hybrid coolers: the VRAM and VRM basically have just a flat aluminum plate to cool them, and the fan on the card itself to cool the plate. Older generations got away with it, but with the insane power consumption of GDDR6X I'm not surprised it needs more. Impressive that you were able to run +1400mhz at those temps though. Makes me wonder just how far mine could go if it wasn't BIOS limited.

Sent from my IN2025 using Tapatalk


----------



## DaftConspiracy

I confirmed with the Shadow of the Tomb Raider benchmark: I have no change to minimum frame rates going from +600 to +1500.

Sent from my IN2025 using Tapatalk


----------



## obscurehifi

DaftConspiracy said:


> Yeah that's the issues with hybrid coolers, vram and vrm has basically just a flat aluminum plate to cool them, and the fan on the card itself to cool the plate. Older generations got away with it but with the insane power consumption of gddr6x I'm not surprised it needs more. Impressive that you were able to run +1400mhz at those temps though. Makes me wonder just how far mine could go if it wasn't bios limited.
> 
> Sent from my IN2025 using Tapatalk


Is an AIO still considered a hybrid? I thought that was when they combined an AIO with a fan/heatsink on the card.

Before this test, I found most of my highest benchmarks seem to be with memory at +1200, which, interestingly, had the highest Tj temps on my chart above. I usually run my benchmarks with the radiator fans at max because I'm able to get a little more performance. Makes sense now that I know the Tj temp comes down a little with the fans on high, which is a little odd since the water cooler isn't connected to the memory. This makes me want to stick a heatsink on the backplate; there isn't a lot of room next to my big CPU air cooler, but I think it could work. 

Sent from my SM-G973U using Tapatalk


----------



## ssgwright

I was able to get my card stable in Port Royal at 2200mhz, but for some reason it scored lower than my 2190 run... hmm, maybe it's a memory issue

2190: scored 12,980
2205: scored 12,865


----------



## DaftConspiracy

obscurehifi said:


> Is an AIO considered Hybrid still? I thought that was when they combined an AIO along with a fan/heatsink on the card.
> 
> Before this test, I have find most of my highest benchmarks seem to be with memory at +1200 which, interestingly, had the highest Tj temps on my chart above. I usually run my benchmarks with the radiator fans at max because I'm able to get a little more performance. Makes sense now that I know the Tj temp comes down a little with the fans on high, which is a little odd since the water cooler isn't connected to the memory. This makes me want to stick a heatsink on the backplate but there isn't a lot of room to my big CPU air cooler but I think it could work.
> 
> Sent from my SM-G973U using Tapatalk


You don't have a fan on the card to cool the ram or vrm?

Sent from my IN2025 using Tapatalk


----------



## DaftConspiracy

ssgwright said:


> i was able to get my card stable in port at 2200mhz but for some reason it scored lower than my 2190 run... hmm maybe it's a memory issue
> 
> 2190: scored 12,980
> 2205: scored 12,865


Port Royal has a lot of run to run variance it seems. I ran into that as well going from 2130mhz to 2160mhz.
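One way to tell a real clock-bump gain apart from that run-to-run variance is to bench the same settings several times and compare the spread to the supposed gain. The scores below are made up for illustration; only the method is the point:

```python
# Quantify run-to-run benchmark variance: repeat runs at FIXED settings,
# then treat any "gain" smaller than about one standard deviation as noise.
from statistics import mean, stdev

same_settings_runs = [12980, 12865, 12924, 12901]  # made-up repeated scores

avg = mean(same_settings_runs)
spread = stdev(same_settings_runs)
print(round(avg), round(spread))
```

With a spread of around 50 points on these made-up numbers, a 100-point difference between two single runs at different clocks proves very little.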

Sent from my IN2025 using Tapatalk


----------



## Mystic33

Hi, I'm sharing my latest result after 5 tries on air with an EVGA 3080 _FTW3 ULTRA GAMING_ on the 450w bios.




















I scored 18 052 in Time Spy


Intel Core i9-9900K Processor, NVIDIA GeForce RTX 3080 x 1, 16384 MB, 64-bit Windows 10




www.3dmark.com





I need more headroom; avg clock is still pretty low at 2134mhz so far. On test 1 I used a custom curve @1.062v and on test 2 a custom curve @1.050v, and I hit the power limit several times. This unit is capable of going over 2200mhz @1.093v with no trouble if I can get at least 500 or 550w of headroom available.

Before I go under water or shunt mod, is there any BIOS above 500w?


----------



## marcoschaap

Hypothetically speaking, can I flash a 450w BIOS to a 2x 8pin PCB? Or does that have a physical or some soft limitation?


----------



## Arni90

DaftConspiracy said:


> 50 is too high, check contact between the block and die
> 
> Sent from my IN2025 using Tapatalk


50 isn't high if he's running a single 360mm radiator, it all depends on water temperature.



Colonel_Klinck said:


> Hmm I ran the Ampere Mem Test on my TUF OC. +1500 for 30 mins and no errors.
> Max 54c on memory temps.
> 
> An hour of playing Insurgency Sandstorm at +1000 max 70c custom loop EKWB


I have an alphacool block with shunt-modded reference PCB, and I'm now wondering if the stock thermal pads for memory are a bit thin.

What software you're running also matters a lot in my experience:
Ampere memory test barely hits 65C memory with GPU temp at 30C.
Quake 2 RTX hits 95C, I generally find that game to hit power limits the hardest as well.


----------



## DaftConspiracy

Arni90 said:


> 50 isn't high if he's running a single 360mm radiator, it all depends on water temperature.
> 
> 
> 
> I have an alphacool block with shunt-modded reference PCB, and I'm now wondering if the stock thermal pads for memory are a bit thin.
> 
> What software you're running also matters a lot in my experience:
> Ampere memory test barely hits 65C memory with GPU temp at 30C.
> Quake 2 RTX hits 95C, I generally find that game to hit power limits the hardest as well.


I don't remember, but I think he was running the stock power limit, so IMO that's still high even for a single 360mm unless it's a slim one. I did think of that once I realized the test relied on the memory being up to temp; I recommend running FurMark alongside the test. I had to enable extreme burn-in and post-FX to consistently maintain maximum power consumption. I think my memory topped out at 69C after an hour.

A good way to tell if the thermal pads are thick enough: when you take the block off, they should be curling up where they don't make contact with the chips, but perfectly flat where the chips do make contact. That's assuming you have the same thin blue pads EK uses; the thick, soft ones don't really curl, but the contact patch should be very visible on those.

Sent from my IN2025 using Tapatalk


----------



## leegoocrap

marcoschaap said:


> Hypothetically speaking, can I flash a 450w BIOS to a 2x 8pin PCB? Or does that have a physical or some soft limitation?


You can flash it, but it's not going to do what you want. It will report 150/150/150, but since you only have 2 plugs you're only getting 150/150... good chance it's a downgrade rather than an upgrade.

Not sure if something like shunt modding would change anything about that


----------



## obscurehifi

DaftConspiracy said:


> You don't have a fan on the card to cool the ram or vrm?
> 
> Sent from my IN2025 using Tapatalk


Nope.








AORUS GeForce RTX™ 3080 XTREME WATERFORCE 10G (rev. 1.0) Key Features | Graphics Card - GIGABYTE Global


Discover AORUS premium graphics cards, ft. WINDFORCE cooling, RGB lighting, PCB protection, and VR friendly features for the best gaming and VR experience!




www.gigabyte.com






Sent from my SM-G973U using Tapatalk


----------



## Arni90

DaftConspiracy said:


> A good way to tell if the thermal pads are thick enough is if you take the block off they should be curling up where they don't make contact with the chips, but perfectly flat where the chips make contact. Assuming you have the same thin blue pads EK uses, the thick & soft ones don't really curl but the contact patch should be very visible on those.


There was no imprint in the thermal pads, and just adding a 0.5mm thermal pad on top dropped memory temps by about 10C, though at the cost of worse GPU contact and mounting pressure.


----------



## DaftConspiracy

obscurehifi said:


> Nope.
> 
> 
> 
> 
> 
> 
> 
> 
> AORUS GeForce RTX™ 3080 XTREME WATERFORCE 10G (rev. 1.0) Key Features | Graphics Card - GIGABYTE Global
> 
> 
> Discover AORUS premium graphics cards, ft. WINDFORCE cooling, RGB lighting, PCB protection, and VR friendly features for the best gaming and VR experience!
> 
> 
> 
> 
> www.gigabyte.com
> 
> 
> 
> 
> 
> 
> Sent from my SM-G973U using Tapatalk


Oh, that's bizarre. I wonder if the coldplate makes contact with that plate covering the RAM. I'd certainly hope so, but given the temps you're getting it looks like it might not. It'd probably be worth making sure you have thermal pads between the backplate and the back of the PCB where the RAM is mounted, to try to improve cooling.

Sent from my IN2025 using Tapatalk


----------



## acoustic

Something has to be wrong for a full-cover block to have temps that high.

My EVGA Hybrid cooler while gaming for hours keeps memory temps no higher than 65c. I would try contacting Gigabyte; maybe the cooler is defective or not making proper contact.

There's no way that should be acceptable.


----------



## obscurehifi

There is one pad on the backside of the card. It's hard to take a picture of, but it's about an inch forward of the power connectors. Should there be more pads?

Sent from my SM-G973U using Tapatalk


----------



## obscurehifi

I finally found a teardown of my card, the Aorus Waterforce. Thoughts on the construction of the waterblock and the location of the heatsink pads? I'm getting 102 degrees C on the memory junction (Tj) temp as shown above. Does anything in this construction explain why?

Tear down starts at 9:54. 







Sent from my SM-G973U using Tapatalk


----------



## man from atlantis

obscurehifi said:


> Finally found a tear down of my card Aorus Waterforce. Thoughts on the construction of waterblock and location of heat sink pads? I'm getting 102 degrees C on the Tj memory junction temp as shown above. Does anything in this construction explain why?
> 
> Tear down starts at 9:54.
> 
> 
> 
> 
> 
> 
> 
> Sent from my SM-G973U using Tapatalk


No Gigabyte 3080 in the lineup has VRAM thermal pads between the PCB and backplate. The Aorus Xtreme has some pads, but for some reason Gigabyte chose to put them on the back of the VRM instead of the VRAM. Gigabyte's thermal pads are also low quality: under high temperatures they chemically break down and leak what's literally fuel, not oil.


----------



## obscurehifi

man from atlantis said:


> No Gigabyte 3080 in the lineup has VRAM thermal pads between the PCB and backplate. The Aorus Xtreme has some pads, but for some reason Gigabyte chose to use them on the back of the VRM instead of the VRAM.


I just saw in the video that they have thermal imaging of the backside of the board, and it's hottest where they have the thermal pad. Perhaps they feel the waterblock is cooling the VRAM enough from the front side. That's shown at 19 minutes in the video.

Would there be a benefit to me adding thermal pad between the back of the card and the back plate? 

Sent from my SM-G973U using Tapatalk


----------



## man from atlantis

obscurehifi said:


> I just saw in the video that they have thermal imaging of the backside of the board, and it's hottest where they have the thermal pad. Perhaps they feel the waterblock is cooling the VRAM enough from the front side. That's shown at 19 minutes in the video.
> 
> Would there be a benefit to me adding thermal pad between the back of the card and the back plate?
> 
> Sent from my SM-G973U using Tapatalk


Your 102C Tj is higher than many air-cooled 3080s. I get 100C with fans set to 40%, +1250MHz VRAM, mining 24/7, and two GPUs in my case. Mining is the worst case.

Thermal pads would definitely help; otherwise the backplate is just cosmetic.


----------



## rjrusek

My PSU's (750W) fan speeds up and gets really loud when the system is under load. I'm currently on water for the CPU (i7-10700) and GPU (RTX 3080 Gaming OC). This happens when playing Control on max settings for at least 10 min.

Just want to ask around: is this normal, or should I replace my PSU?


----------



## noxyd

Finally managed to break 20,000 on Time Spy, not by much though!








I scored 16 659 in Time Spy


Intel Core i7-9700K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10




www.3dmark.com





+100 core
+1200 memory
+100% voltage

3080 FTW3
430-450W sustained, GPU reaches 60C after 2 runs...

@mouacyk: I did check the standoffs and they looked OK. Contact also looked good. Unfortunately, no change for me. I used EK thermal paste to repaste (I was out of Kryonaut).

I saw people here at 40-45C on water, so I was worried that my block was not properly mounted.
I double-checked everything and, most importantly, took the time to compare with comparable rigs.

My loop is just not designed for this many watts; I'm in the 55-60C range under a 450W GPU load.
I have a Cooler Master NR200 ITX case with a 280mm rad + a 240mm rad (both 30mm thick), with a 9700K @ 5.1GHz in the loop.
My bottom rad has zero clearance, so its fans pull directly against the block, which I'm sure reduces cooling efficiency a lot.
See build here.

I've checked numerous small-form-factor cases on Reddit, and the 50-60C range seems realistic for ITX cases with dual-rad configurations.
I think I'm going to settle on an undervolt of 1V @ 2040MHz, which takes me to 380W max.
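That undervolt figure lines up with a first-order dynamic-power estimate. A minimal sketch; the 450W / 2100MHz / 1.081V baseline is my assumption for illustration, and it ignores leakage and memory power, so treat it as a rough guide only:

```python
def scaled_power(p0_w, f0_mhz, v0_volts, f1_mhz, v1_volts):
    """First-order CMOS dynamic-power estimate: P scales with f * V^2.
    Ignores static leakage and memory/VRM power, so it's only a rough guide."""
    return p0_w * (f1_mhz / f0_mhz) * (v1_volts / v0_volts) ** 2

# Assumed baseline of 450W at 2100MHz / 1.081V, undervolted to 2040MHz / 1.000V:
estimate = scaled_power(450, 2100, 1.081, 2040, 1.000)  # roughly 374W
```

Which is close to the ~380W cap reported above, so most of the savings really do come from the voltage term.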


----------



## Arni90

Arni90 said:


> There was no imprint in the thermal pads, and just adding a 0.5mm thermal pad on top dropped memory temps by about 10C, though at the cost of worse GPU contact and mounting pressure.


The (obvious) solution was to add thermal paste between the memory, thermal pad, and waterblock.


----------



## DaftConspiracy

obscurehifi said:


> Finally found a tear down of my card Aorus Waterforce. Thoughts on the construction of waterblock and location of heat sink pads? I'm getting 102 degrees C on the Tj memory junction temp as shown above. Does anything in this construction explain why?
> 
> Tear down starts at 9:54.
> 
> 
> 
> 
> 
> 
> 
> Sent from my SM-G973U using Tapatalk


Lol at that random thermal pad. It doesn't make contact with any of the hot spots on the back of the card; I don't know why they even bothered. There should be three pads around the die where the memory chips are, and it might be a good idea to put them on the VRM MOSFETs too while you're in there.

Sent from my IN2025 using Tapatalk


----------



## DaftConspiracy

rjrusek said:


> My PSU's (750W) fan speeds up and gets really loud when the system is under load. I'm currently on water for the CPU (i7-10700) and GPU (RTX 3080 Gaming OC). This happens when playing Control on max settings for at least 10 min.
> 
> Just want to ask around: is this normal, or should I replace my PSU?


Are you running everything stock? If so, you should be fine, but it is putting a lot of load on the PSU, so it is going to ramp up the fan. If you run increased power limits on the CPU and/or GPU, a new PSU might not be a bad idea.

Sent from my IN2025 using Tapatalk


----------



## DaftConspiracy

obscurehifi said:


> I just saw on the video they have a thermal imaging if the backside of the board and it's hottest where they have the thermal pad. Perhaps they feel the waterblock is cooling the vram enough from the front side. That's shown at 19 minutes in the vid.
> 
> Would there be a benefit to me adding thermal pad between the back of the card and the back plate?
> 
> Sent from my SM-G973U using Tapatalk


The reason those areas are so hot is that those components are sinking heat into that thin piece of copper, which is covered in a plastic shroud that holds the heat in. The die area is cooler because the actual coldplate makes contact with it. By the looks of it, they're relying on poor contact between that copper plate and the coldplate to cool the memory and VRM. That's a really poor design.

Sent from my IN2025 using Tapatalk


----------



## SPL Tech

This is why you build a custom loop and skip those cheap, overpriced hybrid cards. My Tj temp is 90C max in Cyberpunk 2077 running at 4K, and I have a +1050MHz VRAM overclock and a shunt mod, so my card produces way more heat than the OEM card does.


----------



## DaftConspiracy

SPL Tech said:


> This is why you buy a custom loop and skip those cheap overpriced hybrid cards. My TJ temp is 90C MAX on Cyberpunk 2077 running at 4k and I have +1050 MHz VRAM overclock and I have a shunt mod so my card produces way more heat than the OEM card does.


EK really took the cake with memory cooling on these cards, then. Gaming, I average 430-440W and the memory junction stays at 54C at +1500MHz. In FurMark I'll hold 470W and the junction topped out at 69C, I think. I wonder why mine stays so much cooler than everyone else's. Thinner thermal pads, maybe? Higher-quality pads? Because I took my jet plate out?

I've got to say, with most cards averaging RAM temps so close to operational limits, I'm surprised there haven't been nearly as many RAM failures as there were with the initial 2080 Tis.

Sent from my IN2025 using Tapatalk


----------



## DaftConspiracy

Arni90 said:


> There was no imprint in the thermal pads, and just adding a 0.5mm thermal pad on top dropped memory temps by about 10C, though at the cost of worse GPU contact and mounting pressure.


Hm, perhaps you could try stretching out the pads to bring them closer to 0.25mm. Also, do you have pads on the back of the card to transfer heat through the PCB into the backplate?

Sent from my IN2025 using Tapatalk


----------



## josephimports

DaftConspiracy said:


> Ek really took the cake with memory cooling on these cards then. Gaming I average 430-440w and memory junction stays at 54c at +1500mhz. In furmark I'll hold 470w and junction will top out at 69c I think it was. Wonder why mine stays so much cooler than everyone else's. Thinner thermal pads maybe? Higher quality pads? Because I took my jet plate out?
> 
> I gotta say with most cards averaging ram temps so close to operational limits I'm surprised there haven't been nearly as many ram failures as there were with the initial 2080 tis.
> 
> Sent from my IN2025 using Tapatalk


I'm averaging similar temps on the Barrow TUF block. I had originally ordered the EK block but found the Barrow in stock and over a hundred dollars cheaper once the backplate was added; the EK block had a two-week delay at the time. I opted to use the original thermal pads over the included Barrow pads, as they appeared to be of higher quality.


----------



## KSIMP88

I hate everyone in this thread. You all suck.



Also, I can't wait to become one of you and, subsequently, love you all. This card is a pain in the butt to source without paying some stupid price! lol


----------



## Felgor

Comparing temps on water is useless without ambient and fan/rad setup info. Not everyone is running max fan and pump speeds with 18C ambient temps or an open window in a snowy winter.

Four days ago it was 40C outside and 28C ambient in this room; today it has been raining all day and it's still 23C ambient.

My rig is built for near silence at less than 1m while maintaining a heavy OC on GPU and CPU, in 22 to 30C ambient room temps.

So without all the environmental data, I would go crazy chasing a 5 or 10C difference.

After all, super low noise, extreme OC, or the ability to do both is the beauty of a custom loop.

Understanding how your loop performs in different situations takes time, but it certainly helps when interpreting non-scientific data from across the globe.
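The ambient point can be made concrete by comparing rise over room temperature rather than raw readings; a trivial sketch with made-up example numbers:

```python
def delta_over_ambient(component_c, ambient_c):
    """Rise over room temperature: the only figure that travels between
    rigs in different climates and seasons."""
    return component_c - ambient_c

# A 50C core in a 28C room and a 44C core in a 22C room perform identically:
same_loop = delta_over_ambient(50, 28) == delta_over_ambient(44, 22)
```

Same loop, two rooms, a 6C gap in raw readings and a 0C gap in what actually matters.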

All that said, idle and max temps are below, after at least 5 back-to-back Time Spy runs. Memory +1284, custom curve similar to +160.
3x 360mm x 50mm rads with Noiseblocker PWM fans, D5 pump. Heatkiller blocks on the 3080 and AMD 5800X at OC settings.

The Heatkiller 3080 backplate has machined grooves to match the heat pad locations. It certainly gets hot, and I lack airflow around the back of the card.



















IDLE 23c Ambient









low rpm fan (same as idle) max temps, ambient 23c










max fan rpm, max temps, ambient 23c


----------



## undertaker2k8

First post, long-time lurker. Are these scores decent? Card at 105% power limit, +65 core, +800 RAM, 95% voltage. Zotac 3080 and an Asus X570/5800X on PBO +175 and DDR4-3600 CL18.


----------



## duckworld

Anyone have a link or tips on where thermal pads should be placed on the XC3? I heard there were maybe one or two spots where it comes without them stock. Would also appreciate recommendations for pads. Thanks.


----------



## SPL Tech

DaftConspiracy said:


> Ek really took the cake with memory cooling on these cards then. Gaming I average 430-440w and memory junction stays at 54c at +1500mhz. In furmark I'll hold 470w and junction will top out at 69c I think it was. Wonder why mine stays so much cooler than everyone else's. Thinner thermal pads maybe? Higher quality pads? Because I took my jet plate out?
> 
> I gotta say with most cards averaging ram temps so close to operational limits I'm surprised there haven't been nearly as many ram failures as there were with the initial 2080 tis.
> 
> Sent from my IN2025 using Tapatalk


What resolution are you running? I wonder if there is a relationship between VRAM use and temps, like running 4K and using all 10GB vs running 1080p and using maybe 7GB. I find 10GB is not quite enough. Most games use all 10GB, and sometimes I see textures loading in on screen, which tells me I have too little VRAM for 4K. Nvidia completely screwed us with 10GB on this card. It should have 16GB.


----------



## EarlZ

Is the Gigabyte 3080 Master Rev2 worth the indefinite wait, and possibly a higher price tag due to the shortage, over the Rev1 Master? Is the PCB design really capable of maximizing the 3x 8-pin?


----------



## Dylanshock

Hello everyone, I'm new. I read the guide on how to flash a 3080 BIOS, and everything went smoothly with my Inno3D Frostbite, but I've tried different BIOSes and none can bring me to 370W; the limit of my card is 340W. I tried the Asus TUF BIOS and the Gigabyte Waterforce 370W BIOS, but neither gets me to 370W. I'm thinking maybe this card is hardware-locked at a maximum of 340W. The only BIOS that pushed higher was the EVGA XC3 FTW 400W one; the card hit 380W, but I don't like running a 3x8-pin BIOS on a 2x8-pin board. In the guide I saw that the Inno3D Frostbite uses the reference PCB. I know the FE uses a customized PCB with two more phases, but the layout is very similar to reference. I tried to flash the FE BIOS but got an ID error, so maybe you can't flash FE BIOSes on non-FE cards. Who can help me reach 370W?


----------



## Dylanshock

Ah, my configuration:
R5 5600X, PBO 4850MHz, IF 1900MHz
RAM: Crucial Ballistix OC'd to 3800MHz CL16
Gigabyte Aorus X570I, latest BIOS F33a
Custom loop, dual 240mm Corsair XR5 rads
The card runs in game at a max of 52 degrees after 2 hours in Control (RTX, 1440p ultra), at 1950MHz / 0.925V with memory +1000MHz; Tjunction 74 degrees.
In game the card draws 335W and sometimes touches the 340W limiter.


----------



## Felgor

Dylanshock said:


> ah my configuration
> R5 5600x pbo 4850mhz IF 1900mhz
> Ram crucial ballistix OC 3800mhz cl16
> Gigabyte aorus x570i bios f33a latest
> Custom loop dual rad 240mm corsair xr5 corsair
> Vga run in game at max 52 degrees After 2 hours in control Rtx 1440p ultra . 1950mhz at 0.925mv and memory +1000mhz..tjunction 74 degrees
> In game vga 335w and sometimes it touches the limiters 340w..


I have a suspicion that the +% in the vBIOS is just a safety reserve. I am running the Zotac AMP Holo BIOS, which is 340W (374W at +10%), and I top out at 342W. The highest TDP % I have seen was 104%, when using 2 PSUs. I would love a full 390W to play with.


----------



## DaftConspiracy

Felgor said:


> Comparing Temps on water is useless without ambient and fan/ rad setup info. Not everyone is running max fan and pump speeds with 18c ambient temps or an open window in a snowy winter.
> 
> 4 days ago it was 40c outside and 28c ambient in this room, today it has been raining all day and still 23c ambient.
> 
> My rig is built for almost silence at less than 1m while maintaining a heavy OC on GPU and CPU, in 22 to 30c ambient room temps.
> 
> So without all the environmental data, I would go crazy chasing a 5 or 10c difference.
> 
> After all, super low noise, extreme OC, or the ability to do both is the beauty of a custom loop.
> 
> Understanding how your loop performs in different situations takes time, but it certainly helps when interpreting non-scientific data from across the globe.
> 
> All that said, idle and max temps below after at least 5 back to back Timespy runs. Memory +1284, custom curve similar to +160
> 3x 360mm 50mm rads with Noiseblocker pwm fans, D5 pump. Heatkiller blocks on 3080 and AMD 5800x at OC settings.
> 
> Heatkiller 3080 backplate has machined grooves to match the heatpad locations. It certainly gets hot and I lack airflow around the back of the card.
> 
> View attachment 2476344
> 
> 
> View attachment 2476345
> 
> 
> IDLE 23c Ambient
> View attachment 2476349
> 
> 
> low rpm fan (same as idle) max temps, ambient 23c
> View attachment 2476350
> 
> 
> 
> max fan rpm, max temps, ambient 23c
> View attachment 2476351


My fans top out at about 1100rpm in game with my card drawing 400-430W and the CPU drawing 100W. Rads were mentioned earlier, but I run a 360x50mm and a 360x55mm with an Alphacool VPP755 pump in a room that's about 68-69°F.

Sent from my IN2025 using Tapatalk


----------



## DaftConspiracy

undertaker2k8 said:


> First post, long time lurker. These scores decent? Card at 105% power limit, +65 vcore , +800 ram, 95% voltage . Zotac 3080 and Asus 570/5800 on PBO+175 and DDR4 3600 CL18.


The CPU score is underperforming. What RAM are you using? I recommend running PBO +150MHz with an all-core Curve Optimizer offset of -10 to -15, depending on what's stable. What's your cooling solution?

Sent from my IN2025 using Tapatalk


----------



## DaftConspiracy

SPL Tech said:


> What resolution are you running? I wonder if there is a relationship between VRAM use and temps, like running 4K and using all 10GB vs running 1080p and using maybe 7GB. I find 10GB is not quite enough. Most games use all 10GB, and sometimes I see textures loading in on screen, which tells me I have too little VRAM for 4K. Nvidia completely screwed us with 10GB on this card. It should have 16GB.


I run 3840x1600 with ultra/4K textures in all my games. Usually I don't see higher than 7GB usage, except for Resident Evil, which will go past 10 and start to use system memory as VRAM.

Sent from my IN2025 using Tapatalk


----------



## eliwankenobi

EarlZ said:


> Is the Gigabyte 3080 Master Rev2 worth the indefinite and possibly even a higher price tag due to shortage than the Rev1 master, is the PCB design really capable of maximizing the 3X8PIN


Gigabyte says it will behave like any other Master 3080; it's just that now, because of cost reduction, they will be using the Xtreme PCB... which opens the possibility of flashing the Master to an Xtreme vBIOS. I'd say it's worth it if within reach. If you can buy a Rev1 Master now, I'd say go for it, unless you are sure you can get a Rev2 soon.


----------



## DaftConspiracy

EarlZ said:


> Is the Gigabyte 3080 Master Rev2 worth the indefinite and possibly even a higher price tag due to shortage than the Rev1 master, is the PCB design really capable of maximizing the 3X8PIN


Without shunt mods you won't be anywhere near the limits of the 8 pin connections

Sent from my IN2025 using Tapatalk


----------



## DaftConspiracy

Dylanshock said:


> Hello everyone, I'm new. I read the guide on how to flash a 3080 BIOS, and everything went smoothly with my Inno3D Frostbite, but I've tried different BIOSes and none can bring me to 370W; the limit of my card is 340W. I tried the Asus TUF BIOS and the Gigabyte Waterforce 370W BIOS, but neither gets me to 370W. I'm thinking maybe this card is hardware-locked at a maximum of 340W. The only BIOS that pushed higher was the EVGA XC3 FTW 400W one; the card hit 380W, but I don't like running a 3x8-pin BIOS on a 2x8-pin board. In the guide I saw that the Inno3D Frostbite uses the reference PCB. I know the FE uses a customized PCB with two more phases, but the layout is very similar to reference. I tried to flash the FE BIOS but got an ID error, so maybe you can't flash FE BIOSes on non-FE cards. Who can help me reach 370W?


All 2x 8-pin cards have the same ~350W power limit due to the load balancing. The only way to increase it is with shunt mods. A 3x 8-pin BIOS actually reduces the power limit, because it thinks the non-existent 3rd connector is drawing as much power as the 1st.
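Assuming that's the mechanism, the arithmetic would look something like this sketch (connector and slot figures are the usual spec ratings; the mirrored-sensor behaviour is this thread's reading, not anything NVIDIA documents):

```python
PCIE_SLOT_W = 66    # roughly what the 12V slot rail supplies in practice
EIGHT_PIN_W = 150   # spec rating per PCIe 8-pin connector

def board_power(n_connectors):
    """Board power available through n 8-pin plugs plus the slot."""
    return n_connectors * EIGHT_PIN_W + PCIE_SLOT_W

bios_budget  = board_power(3)  # 516W: what a 3x8-pin BIOS thinks it has
physical_max = board_power(2)  # 366W: what a 2x8-pin board is wired for

# Worse: if connector 1's current sensor is mirrored onto the phantom
# connector 3, its reading counts twice, so connector 1 throttles at about
# half its 150W share, and real draw lands nearer 150/2 + 150 + 66 = 291W.
```

That last figure matches the "280-300W instead of 380W" behaviour people report after cross-flashing.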

Sent from my IN2025 using Tapatalk


----------



## Dylanshock

Felgor said:


> I have a suspicion that the +% in the Vbios is just a safety reserve...I am running the Zotac AMP Holo bios which is 340w, 374w at +10% and I top out at 342w. Highest TDP % I have seen was 104% when using 2 PSU's. I would love a full 390w to play with.


Why do you use 2 PSUs? I have a Corsair SF750, which has three PCIe lines.


----------



## Dylanshock

DaftConspiracy said:


> All 2x8 pin cards have the same 350w power limit due to the load balancing. Only way to increase it is with shunt mods. The 3x8 pin bios actually reduces power limit because it thinks the non-existent 3rd connector is drawing as much power as the 1st.
> 
> Sent from my IN2025 using Tapatalk


So all the cards' BIOSes are capped at a maximum of 340/350W, except the FE. Is it possible to flash the FE BIOS on non-FE cards?


----------



## Felgor

Dylanshock said:


> Hello everyone, I'm new. I read the guide on how to flash a 3080 BIOS, and everything went smoothly with my Inno3D Frostbite, but I've tried different BIOSes and none can bring me to 370W; the limit of my card is 340W. I tried the Asus TUF BIOS and the Gigabyte Waterforce 370W BIOS, but neither gets me to 370W. I'm thinking maybe this card is hardware-locked at a maximum of 340W. The only BIOS that pushed higher was the EVGA XC3 FTW 400W one; the card hit 380W, but I don't like running a 3x8-pin BIOS on a 2x8-pin board. In the guide I saw that the Inno3D Frostbite uses the reference PCB. I know the FE uses a customized PCB with two more phases, but the layout is very similar to reference. I tried to flash the FE BIOS but got an ID error, so maybe you can't flash FE BIOSes on non-FE cards. Who can help me reach 370W?


Was there a performance gain at 380W with the XC3 BIOS?



DaftConspiracy said:


> My fans top out at about 1100rpm in game with my card drawing 400-430w and cpu drawing 100w. Rads were mentioned earlier but I run a 360x50mm and 360x55mm with an alphacool vp755 pump in a room thats about 68-69f.
> 
> Sent from my IN2025 using Tapatalk


Thanks, that helps put temps in perspective. I wasn't talking about anyone in particular, just in general... on a few forums and threads.



Dylanshock said:


> why do you use 2 psu? i have a corsair sf750w which has three pcie lines ..


My 12V rails on the GPU were dropping to 12.5 or 12.6V, so I ran a separate PSU for the PCIe power to test. I've hated the Corsair AX860i ever since I discovered its crap fan algorithm.

I swapped back to my old Zalman 860W, and the 12V rails are better; the low-rpm fan is also quieter than the Corsair ramping its fan on and off every 30 seconds at certain loads.


----------



## DaftConspiracy

Dylanshock said:


> So all the cards' BIOSes are capped at a maximum of 340/350W, except the FE. Is it possible to flash the FE BIOS on non-FE cards?


Correct. As of now it's not possible to flash an FE BIOS on another card, because the FE uses a "different" die (it just appears to have a slightly different name), so nvflash won't allow it. I've seen some modified 3090 BIOSes floating around; it would've been nice to see modified 3080 BIOSes, but it looks like everyone's just shunt modding instead of trying to make those.

Sent from my IN2025 using Tapatalk


----------



## DaftConspiracy

Felgor said:


> Was there a performance at 380w with the xc3 bios?
> 
> 
> 
> Thanks that helps put temps in perspective. I wasn't talking about anyone in particular, just in general...on a few forums and threads.
> 
> 
> 
> My 12v rails on the GPU were dropping to 12.5 or 12.6v , so I ran a seperate psu for the pcie power to test. I've hated the Corsair AX860i ever since I discovered its crap fan algorithm.
> 
> I swapped back to my old Zalman 860w and the 12v rails are better and the low rpm fan is quieter than the corsair ramping a fan on and off at certain loads every 30 seconds.


It would've actually been a performance decrease, because instead of running 380W he was really running something like 280-300W, since the drivers were duplicating the draw on connector 1 to connector 3.

Sent from my IN2025 using Tapatalk


----------



## blitzwind87

Hi guys, anyone here using the Colorful iGame RTX 3080 Vulcan X OC? Mind sharing your OC BIOS? I want to try flashing it to my Vulcan OC model.


----------



## EarlZ

eliwankenobi said:


> Gigabyte says it will behave like any other Master 3080, just that now because of cost reduction they will be using the Xtreme PCB... which opens the possibility of flashing the Master to an Xtreme vBIOS.. I’d say it’s worth it if within reach. If you can buy a Rev.1 Master now, I’d say go for it. Unless you are sure you can get a Rev.2 soon


So this means the Rev2 really is the better choice. I got lucky: the stock that arrived is Rev2.


----------



## Dylanshock

The XC3 BIOS works at 380W, but not well. I saw the consumption with CPU-Z, but the clock remains stuck at 1710MHz; I tried to undervolt but nothing happens. The strange thing is that the PCIe slot takes only 40W, with the rest of the 340W coming from the PCIe connectors. In GPU-Z it looked as if I had connectors 1-2-3: connectors 1 and 3 showed the same 115W and connector 2 showed 110W, so together connectors 1 and 3 were over 230W. The PSU rail on that connector dropped to 11.9V; normally my Corsair holds 12V at full load.


----------



## Colonel_Klinck

So I changed the shunts on my TUF from stacked 008 shunts to 005s, with a 010 on the PCIe slot shunt. Managed to break through 19k with 19,024, but I'm still hitting power limits. That is with +1300 mem.
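For anyone following the shunt talk: the controller senses voltage across the shunt, so the reported power scales with the resistance it sees. A sketch of the math, assuming 5 mOhm stock shunts (common on these boards, but verify on your own card):

```python
def stacked(r1_mohm, r2_mohm):
    """Effective resistance of one shunt soldered on top of another (parallel)."""
    return r1_mohm * r2_mohm / (r1_mohm + r2_mohm)

def reported_fraction(r_new_mohm, r_stock_mohm=5.0):
    """Fraction of real power the controller now reports, since V = I * R."""
    return r_new_mohm / r_stock_mohm

# Example: stacking an 8 mOhm shunt on an assumed 5 mOhm stock shunt.
r_eff = stacked(5.0, 8.0)         # about 3.08 mOhm
frac = reported_fraction(r_eff)   # about 0.62, so a 340W cap allows ~550W real
```

Replacing a shunt outright works the same way: the new-to-stock resistance ratio is the fraction of real draw the BIOS sees.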









I scored 19 024 in Time Spy


Intel Core i9-10900K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10




www.3dmark.com


----------



## undertaker2k8

DaftConspiracy said:


> CPU score is under performing. What ram are you using? I recommend running pbo +150mhz with an all core curve optimization at -10 to -15 depending on what's stable. What's your cooling solution?
> 
> Sent from my IN2025 using Tapatalk


Secondary timings make a world of difference, who knew? Lol
That Reddit thread was spot on:

https://www.reddit.com/r/ryzen/comments/kbumil

Only halfway done, but much happier now.


----------



## undertaker2k8

Some further tweaking on the GPU, gonna leave this for now...


----------



## DaftConspiracy

undertaker2k8 said:


> Secondary timings make a world of difference, who knew? Lol
> That reddit thread was spot on
> 
> __
> https://www.reddit.com/r/ryzen/comments/kbumil
> , only halfway done but much happier now
> View attachment 2476389


Interesting, I might have to play around with mine. Are you running 2 sticks or 4?

Sent from my IN2025 using Tapatalk


----------



## Colonel_Klinck

So while I was changing the shunts, I checked the standoffs and three weren't tightened right down. As suggested, I added thermal pads to the back of the die area and used the thermal pads off the OEM backplate on the EK backplate for the memory. After an hour of gaming, temps went from 50C to 44C on the core and from 70C to 65C on the memory. My fans aren't ramped up on the MB BIOS I use for gaming, due to noise.


----------



## Dylanshock

This is my result with the 340W Inno3D and a Ryzen 5600X. I undervolted the GPU to optimize the clock; memory +1300MHz.
I think with a 5900X the result could be better.
https://www.3dmark.com/3dm/57552558


----------



## leegoocrap

I scored 17 362 in Time Spy


AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 16384 MB, 64-bit Windows 10




www.3dmark.com





Here is my Timespy run. Still have some more tuning to do


----------



## Dylanshock

leegoocrap said:


> I scored 17 362 in Time Spy
> 
> 
> AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 16384 MB, 64-bit Windows 10
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> Here is my Timespy run. Still have some more tuning to do
> View attachment 2476423


Nice!! The CPU score is very good. The GPU is more or less the same as mine. Which brand is the card?


----------



## leegoocrap

Dylanshock said:


> Nice!! The CPU score is very good. The GPU is more or less the same as mine. Which brand is the card?


It's an EVGA XC3 (painted shunts). The OC wasn't at max, but it's hot today and I didn't want to deal with opening the case up and pointing a fan at it.

5800X PBO (4x scalar, +200 max)
Viper 4000 B-die; I'm about 70% tuned stable... finishing secondaries, about to start on tertiaries. I've been wasting a bunch of time trying to get 3800/1900 FCLK on this CPU, but so far it has eluded me. 1867 is perfect; 1900 took a fair bit of work just to post, and even then I got crackling speakers immediately in Windows. Stinks, since my old 3600 (OG) was a champ at 3800/1900.


----------



## Dylanshock

leegoocrap said:


> It's an EVGA XC3 (painted shunts). OC wasn't at max, but it's hot today and I didn't want to deal with opening the case up and pointing a fan at it.
> 
> 5800X PBO (4x scalar, +200 max)
> Viper 4000 B-die, I'm about 70% tuned stable... finishing secondaries, about to start on terts... have been wasting a bunch of time trying to get 3800/1900 FCLK on this CPU but so far it has eluded me. 1867 - perfect. 1900 - took a fair bit just to get it to post, but even then, immediately in Windows, crackling speakers. Stinks, my old 3600 (OG) was a champ at 3800/1900.


How many watts did you gain with the shunt? I'm doing very well with the 5600X at 1900 FCLK... the RAM is Crucial Ballistix 2x8GB 3600MHz... OC'd to 3800 CL16-18-18... at 1966 the PC does not start... would that be the RAM or the CPU?


----------



## DaftConspiracy

Colonel_Klinck said:


> So while I was changing the shunts I checked the standoffs and three weren't tightened right down. As suggested, I added thermal pads to the back of the die and used the thermal pads off the OEM backplate on the EK backplate for the memory. After an hour of gaming, temps went from 50°C to 44°C on the core and from 70°C to 65°C on the memory. My fans aren't ramped up on the MB BIOS I use for gaming, due to noise.


Interesting, this is the first I've heard of a standoff issue with an EK product; the ones I checked on mine were pretty tight. Good to hear the backplate thermal pad trick worked for you too. What card do you use? I'm going to email EK and suggest they add additional pads for the 3080s. Not including them is just lazy; it's a $50-$70 aluminum plate, so it's not like it'll hurt their profit margins.

Sent from my IN2025 using Tapatalk


----------



## DaftConspiracy

Dylanshock said:


> How many watts did you gain with the shunt? I'm doing very well with the 5600X at 1900 FCLK... the RAM is Crucial Ballistix 2x8GB 3600MHz... OC'd to 3800 CL16-18-18... at 1966 the PC does not start... would that be the RAM or the CPU?


That's an IF limit you're hitting. Make sure you have the latest bios, but don't expect to get more than 1900mhz on IF. Adding 2 more sticks of memory will do wonders since it effectively changes your config from single-rank, dual channel memory to dual-rank, dual channel memory. This is what I get with 4 sticks of that kit with the same OC.
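The 1:1 clock coupling being tuned throughout this thread can be sketched in a few lines. This is a minimal illustration, not from any real tool: the helper name `zen3_clocks` is made up, and a fixed 1:1:1 FCLK:UCLK:MCLK mode is assumed.

```python
# Minimal sketch of the DDR4 / Infinity Fabric relationship discussed
# above, assuming 1:1:1 FCLK:UCLK:MCLK coupling. Names are illustrative.

def zen3_clocks(dram_mt_s: int) -> dict:
    """Matching clocks (MHz) for a DDR4 transfer rate given in MT/s.

    DDR transfers twice per clock, so MCLK is half the rated MT/s;
    in 1:1 mode the fabric clock (FCLK) and memory controller clock
    (UCLK) run at that same speed.
    """
    mclk = dram_mt_s // 2
    return {"MCLK": mclk, "UCLK": mclk, "FCLK": mclk}

# DDR4-3800 is the 1900 FCLK target mentioned here;
# DDR4-3600 pairs with the easier 1800 FCLK.
print(zen3_clocks(3800))  # {'MCLK': 1900, 'UCLK': 1900, 'FCLK': 1900}
```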








I scored 18 147 in Time Spy
AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10
www.3dmark.com





Sent from my IN2025 using Tapatalk


----------



## DaftConspiracy

leegoocrap said:


> It's an EVGA XC3 (painted shunts). OC wasn't at max, but it's hot today and I didn't want to deal with opening the case up and pointing a fan at it.
> 
> 5800X PBO (4x scalar, +200 max)
> Viper 4000 B-die, I'm about 70% tuned stable... finishing secondaries, about to start on terts... have been wasting a bunch of time trying to get 3800/1900 FCLK on this CPU but so far it has eluded me. 1867 - perfect. 1900 - took a fair bit just to get it to post, but even then, immediately in Windows, crackling speakers. Stinks, my old 3600 (OG) was a champ at 3800/1900.


Make sure you have the latest bios, I wasn't able to run 1800mhz IF until I updated to the latest and greatest agesa a couple weeks ago. Also download hwinfo64 and scroll all the way to the bottom where it says "windows hardware errors," those are caused by unstable infinity fabric. Best to run timespy, cinebench, or prime95 mem stress with that open for 20 min and make sure WHEA stays at 0.

Sent from my IN2025 using Tapatalk


----------



## mouacyk

Colonel_Klinck said:


> So while I was changing the shunts I checked the standoffs and three weren't tightened right down. As suggested, I added thermal pads to the back of the die and used the thermal pads off the OEM backplate on the EK backplate for the memory. After an hour of gaming, temps went from 50°C to 44°C on the core and from 70°C to 65°C on the memory. My fans aren't ramped up on the MB BIOS I use for gaming, due to noise.


Yep, had the same issue on my Bykski block and got significant temp improvements after tightening them all.


----------



## Dylanshock

DaftConspiracy said:


> That's an IF limit you're hitting. Make sure you have the latest bios, but don't expect to get more than 1900mhz on IF. Adding 2 more sticks of memory will do wonders since it effectively changes your config from single-rank, dual channel memory to dual-rank, dual channel memory. This is what I get with 4 sticks of that kit with the same OC.
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 18 147 in Time Spy
> 
> 
> AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> Sent from my IN2025 using Tapatalk


I updated the Aorus X570I BIOS two days ago. I noticed it was easy to raise Infinity Fabric and make it stable at 1900MHz. Unfortunately my ITX board only has two RAM slots; if I change to a mATX I will add two more sticks. Is there a big performance boost? I would like to shunt mod on paper, but I'm afraid for the warranty. I read that you did it with paint, but what does that consist of?


----------



## Dylanshock

I was looking for a new mATX board, but there are only B550M models and no X570 ones. What would be the best B550M on the market with excellent overclocking qualities?


----------



## Colonel_Klinck

DaftConspiracy said:


> Interesting, this is the first I've heard of a standoff issue with an EK product; the ones I checked on mine were pretty tight. Good to hear the backplate thermal pad trick worked for you too. What card do you use? I'm going to email EK and suggest they add additional pads for the 3080s. Not including them is just lazy; it's a $50-$70 aluminum plate, so it's not like it'll hurt their profit margins.
> 
> Sent from my IN2025 using Tapatalk


ASUS TUF OC. The 3 standoffs took about a 1/5 turn each.


----------



## Hirtle

DaftConspiracy said:


> Interesting, this is the first I've heard of a standoff issue with an EK product; the ones I checked on mine were pretty tight. Good to hear the backplate thermal pad trick worked for you too. What card do you use? I'm going to email EK and suggest they add additional pads for the 3080s. Not including them is just lazy; it's a $50-$70 aluminum plate, so it's not like it'll hurt their profit margins.
> 
> Sent from my IN2025 using Tapatalk


Good luck trying to get anything out of EK. I've been talking to them for the past few weeks about the issue with their backplates for 30-series cards. I explained to them that the product didn't include the number of thermal pads the instructions said it should. They told me they would send me more, as long as I pay for shipping. After I've already paid for them?


----------



## leegoocrap

DaftConspiracy said:


> Make sure you have the latest bios, I wasn't able to run 1800mhz IF until I updated to the latest and greatest agesa a couple weeks ago. Also download hwinfo64 and scroll all the way to the bottom where it says "windows hardware errors," those are caused by unstable infinity fabric. Best to run timespy, cinebench, or prime95 mem stress with that open for 20 min and make sure WHEA stays at 0.
> 
> Sent from my IN2025 using Tapatalk


Yeah, latest BIOS/everything... rock solid below 1900, but no need to even test stability at 1900 if the sound is already crackling in the speakers. Some other folks seem to be having the issue on Aorus X570 boards... but without knowing everyone's settings etc. it's hard to differentiate a trend from a couple of people with bad bins.


----------



## leegoocrap

Dylanshock said:


> I updated the Aorus X570I BIOS two days ago. I noticed it was easy to raise Infinity Fabric and make it stable at 1900MHz. Unfortunately my ITX board only has two RAM slots; if I change to a mATX I will add two more sticks. Is there a big performance boost? I would like to shunt mod on paper, but I'm afraid for the warranty. I read that you did it with paint, but what does that consist of?


You have to, at bare minimum, scrape the coating off the edges of the shunts to do the paint mod, so while the paint itself isn't permanent, you are modding the card (for warranty purposes). Whether or not the manufacturer would notice... hard to say... it also brings up an ethical question, but that's up to each person to decide in my mind.

You can look through the last few pages of the easy shunt mod thread for my saga with it. Was it worth it to do? In a vacuum, no... the warranty is worth more than the gains. It makes more sense to sell the card you've got on eBay and buy a 3x8-pin / better card, especially with the inflated prices. If you just really like to tinker with stuff, see what you can get out of it (provided you've come to terms with the possibility that you _could_ ruin your card); it's a less intimidating entry than going straight to soldering. Honestly though, I will probably move on to soldering shunts in the next little bit, as it's tough to control exactly what results you get just lathering on paint and hoping for the best.


----------



## DaftConspiracy

Dylanshock said:


> I updated the Aorus X570I BIOS two days ago. I noticed it was easy to raise Infinity Fabric and make it stable at 1900MHz. Unfortunately my ITX board only has two RAM slots; if I change to a mATX I will add two more sticks. Is there a big performance boost? I would like to shunt mod on paper, but I'm afraid for the warranty. I read that you did it with paint, but what does that consist of?


That's unfortunate. Gamers Nexus and Hardware Unboxed have good videos benchmarking the difference. I saw a huge difference in Time Spy going from 2 sticks to 4, I think it was 500 points or so in the CPU score.



leegoocrap said:


> Yeah, latest BIOS/everything... rock solid below 1900, but no need to even test stability at 1900 if the sound is already crackling in the speakers. Some other folks seem to be having the issue on Aorus X570 boards... but without knowing everyone's settings etc. it's hard to differentiate a trend from a couple of people with bad bins.


I have the cheaper (Gigabyte) version of that board, and with the new BIOS I can run IF at 1800MHz. The difference between 1800MHz and 1733MHz isn't huge though, as long as you tune timings.

Sent from my IN2025 using Tapatalk


----------



## DaftConspiracy

leegoocrap said:


> You have to, at bare minimum, scrape the coating off the edges of the shunts to do the paint mod, so while the paint itself isn't permanent, you are modding the card (for warranty purposes). Whether or not the manufacturer would notice... hard to say... it also brings up an ethical question, but that's up to each person to decide in my mind.
> 
> You can look through the last few pages of the easy shunt mod thread for my saga with it. Was it worth it to do? In a vacuum, no... the warranty is worth more than the gains. It makes more sense to sell the card you've got on eBay and buy a 3x8-pin / better card, especially with the inflated prices. If you just really like to tinker with stuff, see what you can get out of it (provided you've come to terms with the possibility that you _could_ ruin your card); it's a less intimidating entry than going straight to soldering. Honestly though, I will probably move on to soldering shunts in the next little bit, as it's tough to control exactly what results you get just lathering on paint and hoping for the best.


If you're unsure, I'd recommend gluing resistors on; it gives pretty consistent results from what I've heard and it's reversible. I soldered mine, but I'm very proficient with a soldering iron and think I could make it look like it hasn't been tampered with if I put the effort in. I'd say the results were absolutely worth it. Went from bouncing between 1860MHz-2000MHz with a very inconsistent framerate to 2130MHz locked, with minimum frames extremely close to average frames.

Sent from my IN2025 using Tapatalk


----------



## obscurehifi

DaftConspiracy said:


> The reason those areas are so hot is because those components are sinking heat to that thin piece of copper that is covered in a plastic shroud, holding the heat in. The die area is cooler because the actual coldplate makes contact with it. They're relying on poor contact between that copper plate and the cold plate to cool the memory and vrm by the looks of it. That's a really poor design.
> 
> Sent from my IN2025 using Tapatalk


I can't really comment on whether the copper plate / cold plate is a bad design for the Aorus 3080 Waterforce or not, but the previous plots I showed with a 102°C memory junction temp were with Furmark and that memory error tester running. That definitely seems to be worst case. I have found that in gaming and Time Spy, the temperature seems to max out at 88°C, not 102°C.

Here's a Time Spy loop from this morning showing this. I started at default settings, then changed memory to +600, then +1200, then back to +0. I also changed the core clock from +0 to +90 to -90 to +90, then +0. The fans were on auto the whole time except for when I changed them to 100% (green), at which point both temperatures dropped. Interestingly, memory overclocking seems to have zero effect on the memory temp. My room with the door closed sits at 21 to 27°C for reference.









Here's a two-hour session of Titanfall 2 running at +0 on both mem and core clocks. This is running Ultra settings at 1440p, pretty much locked in at 240Hz (amazing experience). This is also with fans on auto. With a throttle temp of 110°C on the memory, this really doesn't seem that bad. I'm sure it can be better, but the Furmark / mem error tests definitely seem worst case and might be more relevant to miners than gamers.









Here's one hour in Dishonored 2. I would have run longer, but I finally completed the game.









Anyway, this card has been a great gaming card for long sessions, and the core clock typically stays around 2000MHz while gaming, even without any overclocking. Here's the GPU core clock from the same gaming sessions above without any overclocking. Not sure how this compares with other cards, but it sure seems to boost to that 2000MHz mark all by itself and maintain it for hours while staying around 56 to 58°C on the GPU with fans on auto. Except for benchmarks, there's not much reason to overclock this 370W card.


----------



## ssgwright

My memory junction temp maxes at 52°C, but it doesn't affect stability; the max I can go without losing performance is +800.


----------



## max883

Silent and cool.. Great performance


----------



## mouacyk

max883 said:


> Silent and cool.. Great performance


How great?


----------



## max883

Undervolt: 0.900V at 1950MHz GPU, and 10000 mem.

Noctua 120mm x2 fans at 50% (1100 RPM); 58°C.


----------



## mouacyk

max883 said:


> Undervolt 0.900 1950.mhz gpu and 10000.mem
> 
> Noctua 120mm x2 Fans at 50% 1100.rpm 58.c


Not bad at all for a simple mod. What is the memory junction temp?


----------



## obscurehifi

max883 said:


> Undervolt 0.900 1950.mhz gpu and 10000.mem
> 
> Noctua 120mm x2 Fans at 50% 1100.rpm 58.c


10000 mem? 

Sent from my SM-G973U using Tapatalk


----------



## EarlZ

DaftConspiracy said:


> Without shunt mods you won't be anywhere near the limits of the 8 pin connections
> 
> Sent from my IN2025 using Tapatalk


I've seen a lot of folks post that they are limited by the dual 8-pin connectors on their cards, as each only provides 150W without mods. I know this is OCN and nobody here runs stock configs. I would love to shunt mod my GPU, but with the very limited availability of the card I would prefer to play it safe for now. This is why I am asking if the 3rd 8-pin is functional, or if it's just like the MSI Gaming X Trio where the VRM is not up to scratch.


----------



## Falkentyne

EarlZ said:


> I've seen a lot of folks post that they are limited by the dual 8-pin connectors on their cards, as each only provides 150W without mods. I know this is OCN and nobody here runs stock configs. I would love to shunt mod my GPU, but with the very limited availability of the card I would prefer to play it safe for now. This is why I am asking if the 3rd 8-pin is functional, or if it's just like the MSI Gaming X Trio where the VRM is not up to scratch.


The third 8 pin is functional, yes, but you are going to run into the output rail power limit long before you reach the limits of the VRM.
The output rail power limit (GPU Core NVVDD Output Power (sum)) seems to ignore shunt mods and just goes way up until it hits a limit, and this limit isn't even shown on the Ampere Bios Editor.
If it's the "Sum" value itself, it seems to trigger between 336W-400W on 3090 cards. If it's the unsummed value (meaning GPU Core NVVDD Output Power--the one RIGHT above SRAM Output Power), seems to be around 300W.

My guess is that this is basically the "Output" version of GPU Chip Power. The one we're shunting with shunt mods is the _INPUT_ version. The Input version also seems to be the one shown in the Ampere Bios editor.
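The input-side shunt mod being discussed works by fooling the current sense: a resistance added in parallel lowers the effective shunt value, so the controller, which still assumes the stock value, under-reads power. A rough sketch of that arithmetic, with assumed example values (5 mΩ is a common shunt value used here purely for illustration, not a measurement from any specific card):

```python
def parallel_mohm(r1: float, r2: float) -> float:
    """Effective resistance (milliohms) of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

def reported_power_w(actual_w: float, r_stock: float, r_added: float) -> float:
    """Power the controller *thinks* is drawn after a parallel shunt mod.

    The sense circuit converts the voltage drop across the shunt to a
    current assuming the stock resistance, so the reading scales by the
    ratio of effective to stock resistance.
    """
    return actual_w * (parallel_mohm(r_stock, r_added) / r_stock)

# Stacking an equal-value 5 mOhm resistor halves the shunt: a real
# 450 W draw reads back as 225 W, comfortably under a 320 W cap.
print(reported_power_w(450, 5.0, 5.0))  # 225.0
```

The conductive-paint variant does the same thing, just with a poorly controlled `r_added`, which is why paint results vary so much between attempts.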


----------



## DaftConspiracy

EarlZ said:


> I've seen a lot of folks post that they are limited by the dual 8-pin connectors on their cards, as each only provides 150W without mods. I know this is OCN and nobody here runs stock configs. I would love to shunt mod my GPU, but with the very limited availability of the card I would prefer to play it safe for now. This is why I am asking if the 3rd 8-pin is functional, or if it's just like the MSI Gaming X Trio where the VRM is not up to scratch.


A 3x8-pin card will bring your hard limit from 350W to 450W total board consumption (if you flash a 450W BIOS or get a Strix). That will be enough for any game you throw at it and almost enough to completely avoid the limit in Time Spy. Just avoid the MSI cards, since there seem to be some design issues with those. Any other brand will have VRMs more than capable of running 500W+, even if they're designed for 2x8-pin.
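For reference, the connector arithmetic behind those numbers can be written down explicitly. These are spec-level figures only; the actual BIOS limit is a separate, usually lower, vendor-set cap:

```python
# Spec-rated input budget for a PCIe graphics card: the x16 slot is
# allowed 75 W and each 8-pin auxiliary connector 150 W per spec.

SLOT_W = 75
EIGHT_PIN_W = 150

def board_power_capacity_w(n_eight_pin: int) -> int:
    """Spec-rated input capacity for a card with n 8-pin connectors."""
    return SLOT_W + n_eight_pin * EIGHT_PIN_W

print(board_power_capacity_w(2))  # 375
print(board_power_capacity_w(3))  # 525
```

In practice connectors and VRMs can deliver well beyond these ratings, which is why shunt-modded 2x8-pin cards can run far past 375 W; the spec number is a budget, not a physical ceiling.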

Sent from my IN2025 using Tapatalk


----------



## mouacyk

Anyone else gaming at 22000MHz on GDDR6X? Is there any way to go higher? MSI AB seems to have topped out at +1500.


----------



## SoldierRBT

@mouacyk 

Precision X1 lets you go higher than +1500 on memory.


----------



## obscurehifi

mouacyk said:


> Anyone else gaming at 22000MHz on GDDR6X? Is there anyway to go higher, MSI AB seems to have topped out at +1500.


Not sure if it works on all brands but Aorus Engine goes higher. It's double though, so 3000 means 1500. You can raise it a ways above 3000. 19000 is default, so technically 22000 is +3000 anyways. 
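The bookkeeping between the two tools can be made explicit. This is a sketch based only on the values stated in this thread: Afterburner offsets count half-rate MHz, Aorus Engine counts the effective data rate directly, and 19000 MT/s is the 3080's stock effective speed.

```python
GDDR6X_STOCK = 19000  # effective MT/s on a stock RTX 3080

def effective_from_afterburner(offset_mhz: int) -> int:
    """Afterburner's slider counts half-rate MHz, so it counts double."""
    return GDDR6X_STOCK + 2 * offset_mhz

def effective_from_aorus(offset: int) -> int:
    """Aorus Engine's value maps 1:1 onto the effective data rate."""
    return GDDR6X_STOCK + offset

# AB +1500 and Aorus Engine +3000 land on the same 22000 MT/s.
print(effective_from_afterburner(1500))  # 22000
print(effective_from_aorus(3000))        # 22000
```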

Sent from my SM-G973U using Tapatalk


----------



## obscurehifi

mouacyk said:


> Anyone else gaming at 22000MHz on GDDR6X? Is there anyway to go higher, MSI AB seems to have topped out at +1500.


Also, if you're going that high on memory, you should check out the Ampere memory error checker to see where your card's limit is. Just because it can go higher doesn't necessarily mean there is a benefit, since I guess the card will slow down to correct the errors.

Sent from my SM-G973U using Tapatalk


----------



## mouacyk

SoldierRBT said:


> @mouacyk
> 
> Precision X1 let you go higher than +1500 on memory.


Wow, it works so much better than AB. It really locks the clock and doesn't bounce around like AB.


----------



## leegoocrap

DaftConspiracy said:


> If you're unsure, I'd recommend gluing resistors on; it gives pretty consistent results from what I've heard and it's reversible. I soldered mine, but I'm very proficient with a soldering iron and think I could make it look like it hasn't been tampered with if I put the effort in. I'd say the results were absolutely worth it. Went from bouncing between 1860MHz-2000MHz with a very inconsistent framerate to 2130MHz locked, with minimum frames extremely close to average frames.
> 
> Sent from my IN2025 using Tapatalk


I would have thought gluing would be a bit worrisome; heat has a chance of melting it, no?
Right now it's just a bunch of layers of 842AR on my card... definitely saw some good gains, but it's a little all over the place (hard to nail down exact layering on each shunt).
I'm fairly handy with a soldering iron, but of course my hands' steadiness is in direct contrast with the cost of what I'm soldering.

I may go back and either try the paint+resistor method, or solder them on... on the one hand, why not... but on the other hand, another couple of FPS in Cyberpunk at 1440p isn't overly important.


----------



## Falkentyne

leegoocrap said:


> I would have thought gluing would be a bit worrisome, heat has a chance of melting it, no?
> Right now just a bunch of layers of 842ar on my card... definitely saw some good gains, but it's a little all over the place (hard to nail down exact layering on each shunt)
> I'm fairly handy with a soldering iron, but of course my hands steadiness is in direct contrast with the cost of what I'm soldering
> 
> I may go back and either try the paint+resistor method, or solder them on... on the one hand, why not... but on the other hand, another couple of fps in Cyberpunk 1440p isn't overly important


Hard not to get the shakes, but it's much more shakes on a $1600 video card than an $800 one!
Why do you want to solder? Are you hitting a power limit?


----------



## leegoocrap

Falkentyne said:


> Hard to not get the shakes but its much more shakes on a $1600 video card than a $800 one!
> Why do you want to solder? Are you hitting a power limit?


Seems that way, still running into power perfcap on higher overclocks... but really that's unimportant for anything other than Port Royal scores... and even now it's getting to the point where much more juice will start overpowering the 240 AIO anyway. Also the fuses on the XC3 will (eventually) become a hurdle...

More than anything it seems I've got a sickness for taking things apart and tinkering with them... having chopped up a few (very expensive) carbon bicycles and race car bodies in the past to "make them better", it seems it's spilled over into my gaming rig as well. Working with composites is a little easier than precision electronics though...


----------



## Falkentyne

leegoocrap said:


> Seems that way, still running into power perfcap on higher overclocks... but really that's unimportant other than Port Royal scores... and even now it's getting to the point where much more juice will start overpowering the 240aio anyways. Also the fuses on the xc3 will (eventually) become a hurdle...
> 
> More than anything it seems I've got a sickness for taking things apart and tinkering with them... having chopped up a few (very expensive) carbon bicycles and race car bodies in the past to "make them better" it seems it's spilled over into my gaming rig as well  working with composites is a little easier than precision electronics though...


Can you post a HWiNFO screenshot with the rail values visible, plus the TDP Normalized %? Was Normalized higher than TDP%? What were you hitting the power limit on?


----------



## mouacyk

obscurehifi said:


> Also, if you're going that high on memory, you should check out the Ampere memory error checker to see where your card's limit is. Just because it can go higher doesn't necessarily mean there is a benefit, since I guess the card will slow down to correct the errors.
> 
> Sent from my SM-G973U using Tapatalk


At +1600 for 22,208MHz! Doing the Heaven benchmark check as initial verification, then ran Ampere Memory Check #3:


Spoiler: Results



Test Results 
Test1 0 errors 
Test2 0 errors 
Test3 0 errors 
Test4 0 errors 
Test5 0 errors 
Test6 0 errors 
Test7 0 errors 
Test8 0 errors 
Test9 0 errors 
Test10 0 errors 
Test11 0 errors 
Test12 0 errors 
Test13 0 errors 
Test14 0 errors 
Test15 0 errors 
Test16 0 errors 
Test17 0 errors 
Test18 0 errors 
Test19 0 errors 
Test20 0 errors 
Test21 0 errors 
Test22 0 errors 
Test23 0 errors 
Test24 0 errors 
Test25 0 errors 
Test26 0 errors 
Test27 0 errors 
Test28 0 errors 
Test29 0 errors 
Test30 0 errors 
Test31 0 errors 
Test32 0 errors 
Test33 0 errors 
Test34 0 errors 
Test35 0 errors 
Test36 0 errors 
Test37 no error 
Test38 no error 
Test39 0 errors 
Test40 no error


----------



## obscurehifi

mouacyk said:


> At +1600 for 22,208MHz! Doing the Heaven benchmark check as initial verification, then ran Ampere Memory Check #3:
> 
> 
> Spoiler: Results
> 
> 
> 
> Test Results
> Test1 0 errors
> Test2 0 errors
> Test3 0 errors
> Test4 0 errors
> Test5 0 errors
> Test6 0 errors
> Test7 0 errors
> Test8 0 errors
> Test9 0 errors
> Test10 0 errors
> Test11 0 errors
> Test12 0 errors
> Test13 0 errors
> Test14 0 errors
> Test15 0 errors
> Test16 0 errors
> Test17 0 errors
> Test18 0 errors
> Test19 0 errors
> Test20 0 errors
> Test21 0 errors
> Test22 0 errors
> Test23 0 errors
> Test24 0 errors
> Test25 0 errors
> Test26 0 errors
> Test27 0 errors
> Test28 0 errors
> Test29 0 errors
> Test30 0 errors
> Test31 0 errors
> Test32 0 errors
> Test33 0 errors
> Test34 0 errors
> Test35 0 errors
> Test36 0 errors
> Test37 no error
> Test38 no error
> Test39 0 errors
> Test40 no error


That looks great! One thing I did was run Furmark at 1440p at the SAME time as the memory error checker, using 1initial.bat or whatever the first .bat is called. That gets everything nice and hot.

Sent from my SM-G973U using Tapatalk


----------



## leegoocrap

Falkentyne said:


> Can you post a hwinfo screenshot with the rail values visible and the TDP Normalized % Was normalized higher than TDP%? What were you hitting the power limit on?











not the absolute max I have been able to get to finish PR on, but this is about as "stable" as I can go and reliably finish the bench. (+150/750)








I scored 12 460 in Port Royal
AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 16384 MB, 64-bit Windows 10
www.3dmark.com


----------



## Falkentyne

leegoocrap said:


> View attachment 2476756
> 
> not the absolute max I have been able to get to finish PR on, but this is about as "stable" as I can go and reliably finish the bench. (+150/750)
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 12 460 in Port Royal
> 
> 
> AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 16384 MB, 64-bit Windows 10
> 
> 
> 
> 
> www.3dmark.com


Found the problem.
You need to re-paint your 8-pin #2 shunt. Make sure you get proper contact and make sure it's very well scraped first!
Remember what I told you before.
Use 3M high-temp polyimide tape (very high quality) or 3M Super 33+ tape completely around the shunt, and make sure you -rub- the paint onto the conductive edges of the shunt so it works its way in, then bridge the edges. Taping around the shunt allows you more safety while working the paint in without accidentally getting paint on the PCB.

You're hitting the individual rail power limit (150W), which is equal to the SRC power limit (the rails and SRC are linked).


----------



## leegoocrap

Falkentyne said:


> You're hitting the individual rail power limit (150W), which is equal to the SRC power limit (the rails and SRC are linked).


Thanks, I'll pull it out and give it another try soon. The pin 2 shunt should be the same as it is in BMGjet's 3090 picture, I suppose.


----------



## ssgwright

Weird, I can run PR at 2,215MHz (avg) but my score is lower than at 2,190... I was using chilled water before and my max temp was 39°C, compared to my normal run at 2,215 with a max temp of 47°C.


----------



## ssgwright

You'll notice on my 2,190 run my mem MHz was lower, so I thought OK, maybe I'm losing performance because my mem is overclocked too high. But that wasn't the case; I dropped my mem MHz and just got worse performance.


----------



## chiknnwatrmln

DaftConspiracy said:


> A 3x8-pin card will bring your hard limit from 350W to 450W total board consumption (if you flash a 450W BIOS or get a Strix). That will be enough for any game you throw at it and almost enough to completely avoid the limit in Time Spy. Just avoid the MSI cards, since there seem to be some design issues with those. Any other brand will have VRMs more than capable of running 500W+, even if they're designed for 2x8-pin.
> 
> Sent from my IN2025 using Tapatalk


Can you elaborate on the design issues with MSI cards?


----------



## DaftConspiracy

ssgwright said:


> Weird, I can run PR at 2,215MHz (avg) but my score is lower than at 2,190... I was using chilled water before and my max temp was 39°C, compared to my normal run at 2,215 with a max temp of 47°C.
> View attachment 2476773


Port Royal is weird, I can't seem to get consistent results with it and have pretty much given up on it. I can't get the same scores I was before even with the exact same settings.



chiknnwatrmln said:


> Can you elaborate on the design issues with MSI cards?


I'm not sure what they are really, I just know they had a recall on their Suprim X card and that it was performing far below their Trio (3x8-pin). MSI has been dropping the ball with a few of their graphics card designs these past couple of generations, not to mention their shady company practices. Best to avoid them.

Sent from my IN2025 using Tapatalk


----------



## ssgwright

The only thing I can think of that changed is maybe the NVIDIA drivers?


----------



## Falkentyne

DaftConspiracy said:


> Port Royal is weird, I can't seem to get consistent results with it and have pretty much given up on it. I can't get the same scores I was before even with the exact same settings.
> 
> 
> 
> I'm not sure what they are really, I just know they had a recall on their supreme x card and that it was performing far below their trio (2x3 pin). MSI has been dropping the ball with a few of their graphics card designs these past couple generations. Not to mention their shady company practices. Best to avoid them.
> 
> Sent from my IN2025 using Tapatalk





ssgwright said:


> the only thing I can think of that changed is maybe the nvidia drivers?


PR scores can vary by as much as 400 points. Usually the fix is to exit 3DMark and then run it again until it 'fixes' the scores. If you are clever, you will know if it's going to score well by looking at the FPS in the first 4 seconds of the bench.


----------



## Imprezzion

Well, if this deal does actually go through I should have a bnib Gigabyte Gaming OC 3080 by Wednesday evening. Picking it up in person.

There aren't that many reviews of that specific model; it's listed here in the OP as a custom PCB with a 370W power limit. What PCB does it use? Is it shared by other Gigabyte models? And what are the must-do mods on it: BIOS flash, cooling mods, and which full-cover blocks fit it if I do decide to get one?


----------



## obscurehifi

Imprezzion said:


> Well, if this deal does actually go through I should have a bnib Gigabyte Gaming OC 3080 by Wednesday evening. Picking it up in person.
> 
> There's not that many reviews of that specific model, it is listed here in the OP as custom PCB 370w power limit. What PCB does it use? Is it shared by other Gigabyte models? And what are the must-do mods on it? BIOS flash, cooling mods, which full cover blocks fit it if I do decide to get a full cover block for it?


This guy reviews the specs and internal components, from reviews and pictures he found, of over 50 3080s. He places that card in the second tier of five, I think. Might be worth watching.






Sent from my SM-G973U using Tapatalk


----------



## mouacyk

Imprezzion said:


> Well, if this deal does actually go through I should have a bnib Gigabyte Gaming OC 3080 by Wednesday evening. Picking it up in person.
> 
> There's not that many reviews of that specific model, it is listed here in the OP as custom PCB 370w power limit. What PCB does it use? Is it shared by other Gigabyte models? And what are the must-do mods on it? BIOS flash, cooling mods, which full cover blocks fit it if I do decide to get a full cover block for it?


VGA Bios Collection | TechPowerUp 
PCB-wise, it seems to be identical to the Eagle, Eagle OC, and Vision cards. The Eagles are lower GPU bins, but the Vision seems to match it and comes in a white shroud. All of these use Gigabyte's custom flat 2x8-pin connectors and need to use extensions to connect to the PSU.

Bykski makes a block (GIGABYTE RTX 3080 3090 GAMING 3X / EAGLE / VISION OC) that fits all these models and their 3090 counterparts. I have the same block on my Eagle OC, but make sure you tighten all the standoffs, otherwise you will not get optimal temperatures. You can see my results in this post, with contact paper.

You can also check out a better user review of the block and installation at HardwareLuxx. I actually followed the instructions there and installed the extra pads, but I got better temps by tightening a few loose standoffs on my block, which that user didn't mention at all. Good luck.


----------



## Muqeshem




----------



## mouacyk

Muqeshem said:


> Spoiler
> 
> 
> 
> 
> View attachment 2476831


Valhalla? Is there a standalone bench -- I don't really want to buy a clunky game just to run its benchmark.


----------



## Muqeshem

mouacyk said:


> Valhalla? Is there a standalone bench -- I don't really want to buy a clunky game just to run its benchmark.


Yes it is, gentoo user.
I think I do better than the average RTX 3080 here.


----------



## Colonel_Klinck

Falkentyne said:


> Found the problem.
> You need to re-paint your 8 pin #2 shunt. Make sure you get proper contact and make sure it's very well scraped first!


Is there a way of telling which is #1 and #2 8 pin?


----------



## flyboy320

I have an Asus TUF 3080, and in an effort to reduce the memory temps I would like to replace the stock thermal pads. Has anyone measured the thickness of the various pads on the TUF? Gamers Nexus did a teardown video on this card and guessed at some of the thicknesses, but I'm wondering if anyone knows for sure.


----------



## Imprezzion

mouacyk said:


> VGA Bios Collection | TechPowerUp
> PCB-wise, it seems to be identical to the Eagle, Eagle OC, and Vision cards. The Eagles are lower GPU bins, but the Vision seems to match it and comes in a white shroud. All of these use Gigabyte's custom flat 2x8-pin connectors and need to use extensions to connect to the PSU.
> 
> Bykski makes a block (GIGABYTE RTX 3080 3090 GAMING 3X / EAGLE / VISION OC) that fits all these models and their 3090 counterparts. I have the same block on my Eagle OC, but make sure you tighten all the standoffs, otherwise you will not get optimal temperatures. You can see my results in this post, with contact paper.
> 
> You can also check out a better user review of the block and installation at HardwareLuxx. I actually followed the instructions there and installed the extra pads, but I got better temps by tightening a few loose standoffs on my block, which that user didn't mention at all. Good luck.


Nice, I hoped it would share the Eagle / Vision PCB. EK doesn't plan to make a block for them, but Bykski is fine as well. I have to plan out my rads first anyway, so it'll stay on air for the first few weeks.

Thanks for the information!


----------



## Falkentyne

Colonel_Klinck said:


> Is there a way of telling which is #1 and #2 8 pin?


Usually 8 pin #1 is to the left of 8 pin #2. That's how it is on eVGA, Strix and FE cards.


----------



## Colonel_Klinck

Falkentyne said:


> Usually 8 pin #1 is to the left of 8 pin #2. That's how it is on eVGA, Strix and FE cards.


OK, mine is the TUF so it should be the same. Just curious, as #2 consistently pulls more power than #1. This is just running Heaven, but you can see it pulls at least an extra 20W, if not more. 005 stacked for the 5x shunts around the 8-pins and 010 on the PCIe slot.


----------



## Falkentyne

So your board is already shunted?

A 20W difference when soldered is probably from a weak solder joint or improper fluxing.
My "badly" soldered FE (used a crappy iron on the two 8 pins...now I have a MUCH better iron--a TS100, but the 8 pins are stable enough!) is 6-8W difference on 8 pins now at 550W (depending on what I'm running). Port Royal=6 watt difference, Heaven=8-9W difference.

Did you shunt everything or did you skip something? Your last post looks like you said "you shunted the two 8 pins and PCIE Slot power ONLY".

If you did NOT shunt the SRC shunt, this can throw off both power balancing and even load (causing lower performance than expected or even lower temps due to lower load).


----------



## Colonel_Klinck

Falkentyne said:


> So your board is already shunted?
> 
> A 20W difference when soldered is probably from a weak solder joint or improper fluxing.
> My "badly" soldered FE (used a crappy iron on the two 8 pins...now I have a MUCH better iron--a TS100, but the 8 pins are stable enough!) is 6-8W difference on 8 pins now at 550W (depending on what I'm running). Port Royal=6 watt difference, Heaven=8-9W difference.
> 
> Did you shunt everything or did you skip something? Your last post looks like you said "you shunted the two 8 pins and PCIE Slot power ONLY".
> 
> If you did NOT shunt the SRC shunt, this can throw off both power balancing and even load (causing lower performance than expected or even lower temps due to lower load).



Mine are applied with the MG silver paint pen. The 5 shunts around the 8 pins are 005 shunts stacked. The PCIe shunt has an 010 stacked. All shunts have been done. I'll do the 2 shunts directly below the 8 pins again.

Cheers


----------



## ssgwright

Colonel_Klinck said:


> Ok mine is the TUF so it should be the same. Just curious as #2 consistently pulls more power than #1. This is just running Heaven but you can see it pulls at least an extra 20w, if not more. 005 stacked for 5x shunts around 8 pins and 010 on PCI-E


Double check your shunts, I think one or two aren't making good contact. I wanna say it's pin #2. Here's mine:


----------



## Falkentyne

Colonel_Klinck said:


> Mine are applied with the MG silver paint pen. The 5 shunts around the 8 pins are 005 shunts stacked. The PCIe shunt has an 010 stacked. All shunts have been done. I'll do the 2 shunts directly below the 8 pins again.
> 
> Cheers


Protip:
If you have too low power reporting on an 8 pin after using 842AR silver paint to attach shunts, you can make a bridge over the top of the new shunt with paint. This will reduce the resistance of that pin even more. But make sure it doesn't degrade with time. A lot of current goes through those shunts!

With the paint, if you don't have proper contact with the paint and the original shunt, it throws off the readings to the main shunt substantially.


----------



## Hirtle

I finally decided to shunt mod my card. I used the MG842AR to stack 5 mΩ resistors on top of the stock shunts. I did not modify the PCIe slot shunt because I've seen several videos, including der8auer's, that did not modify the PCIe slot shunt, yet the board still pulled more power. My total power draw has not changed from stock, and I'm not sure why. I'll attach a screenshot of GPU-Z while running Furmark.








Here's a run of Port Royal. The 13047 is before, the 13008 is after the shunt mod. It seems to be just slightly more stable at 2175 avg., vs. 2159 avg. before.
https://www.3dmark.com/compare/pr/790788/pr/841718


----------



## Falkentyne

Hirtle said:


> I finally decided to shunt mod my card. I used the MG842AR to stack 5MO resistors on top of the stock shunts. I did not modify the PCIE slot shunt because I've seen several videos, including der8auer's video, that did not modify the PCIE slot shunt, yet the board still pulled more power. My total power draw has not changed from stock, and I'm not sure why. I'll attach a screenshot of GPU-Z while running Furmark.
> View attachment 2476856
> 
> Here's a run of Port Royal. the 13047 is before, the 13008 is after the shunt mod. It seems to be just slightly more stable at 2175 avg., vs. 2159 avg. before.


What exact card/board is this?

Regardless of the board used, it is important to SCRAPE the edges of the shunts (the silver part) down to the shiny area, to remove the conformal coating. This isn't necessary when soldering because the soldering iron will melt the coating when you apply 300C+ temps to it. I do not know which boards use conformal coating (the Founder's edition does and it's highly likely the Gigabyte boards do also).

There is some difficulty in applying MG 842AR to certain shunts on certain boards. It is absolutely _ESSENTIAL_ that you insulate the PCB with either 3M Super 33+ tape, or 3M high temp polyimide tape (this polyimide tape is worth the price and will SAVE your butt when you move up into soldering or desoldering shunts) when dealing with the tricky shunts.


----------



## Hirtle

Falkentyne said:


> What exact card/board is this?
> 
> Regardless of the board used, it is important to SCRAPE the edges of the shunts (the silver part) down to the shiny area, to remove the conformal coating. This isn't necessary when soldering because the soldering iron will melt the coating when you apply 300C+ temps to it. I do not know which boards use conformal coating (the Founder's edition does and it's highly likely the Gigabyte boards do also).
> 
> There is some difficulty in applying MG 842AR to certain shunts on certain boards. It is absolutely _ESSENTIAL_ that you insulate the PCB with either 3M Super 33+ tape, or 3M high temp polyimide tape (this polyimide tape is worth the price and will SAVE your butt when you move up into soldering or desoldering shunts) when dealing with the tricky shunts.


It's a Strix. There is no conformal coating. I didn't use anything to protect the PCB around the shunts because there's nothing even close to them on the Strix besides the pins for the 8 pin PCIE connectors near the top. I still double checked the board to make sure none of the paint squeezed out to short anything else.


----------



## Falkentyne

Hirtle said:


> It's a Strix. There is no conformal coating. I didn't use anything to protect the PCB around the shunts because there's nothing even close to them on the Strix besides the pins for the 8 pin PCIE connectors near the top. I still double checked the board to make sure none of the paint squeezed out to short anything else.


Even if there is no conformal coating, you should still scrape. The coating isn't like some sort of glaze like you're thinking. If you take a screwdriver to the edge of the shunt and you notice a strange film coming off when you scrape, that's the conformal coating.

Did you try bridging the entire shunt first (edge to edge) with the paint before stacking?
Doing this (with about 3 coats total) should give you another 100 watts at the minimum just from the paint.
Then if you needed more, you could throw a 5 mOhm shunt on top of it with another layer.


----------



## Colonel_Klinck

Ok I'll


Hirtle said:


> It's a Strix. There is no conformal coating. I didn't use anything to protect the PCB around the shunts because there's nothing even close to them on the Strix besides the pins for the 8 pin PCIE connectors near the top. I still double checked the board to make sure none of the paint squeezed out to short anything else.



I've got the Asus TUF and there was definitely a coating on the shunts. I would imagine they use the same shunts. It's clear, but when you scrape it off the surface underneath is shiny.

The shunts I fitted also had a coating.


----------



## ducegt

Port Royal 12,934 https://www.3dmark.com/pr/840152

Gaming X Trio in an enclosed case with 74F ambient. 13K seems so close, but yet so far away.


----------



## Falkentyne

Hirtle said:


> It's a Strix. There is no conformal coating. I didn't use anything to protect the PCB around the shunts because there's nothing even close to them on the Strix besides the pins for the 8 pin PCIE connectors near the top. I still double checked the board to make sure none of the paint squeezed out to short anything else.





Colonel_Klinck said:


> Ok I'll
> 
> 
> 
> I've got the Asus TUF and there was definitely a coating on the shunts. I would imagine they use the same shunts. It's clear, but when you scrape it off the surface underneath is shiny.
> 
> The shunts I fitted also had a coating.


The Strix and the TUF have exactly the same shunts (flush, not depressed like Founder's Editions). If there is coating on the TUF, there's absolutely coating on the Strix. Just because you didn't see coating doesn't mean it's not there. It's not going to look like you varnished it in nail polish or MG Chemicals conformal coating or anything (e.g. liquid metal protection). It's just going to look dull on the silver.

Scraping it will show a layer clearly coming off. Be careful with scraping. I would take the time to put Super 33+ tape on the PCB before scraping.
Also if you already have paint on and you need to remove it for scraping, DEFINITELY insulate the PCB! You don't want to slip with a mini flat blade and scratch something!!

Note: Isopropyl (100% is best) really cuts through old paint, but then it gets on the PCB. 3M Super 33+ tape insulation REALLY helps here, and leaves a very small cleanup rather than a big one afterwards. (3M high temp polyimide tape would also work, but alcohol would also make it lose adhesion slowly as it gets under the edges.)


----------



## Hirtle

Well I feel silly now. The conformal coating must be super thin, I couldn't see it at all. I'll try it again tomorrow.


----------



## Falkentyne

Hirtle said:


> Well I feel silly now. The conformal coating must be super thin, I couldn't see it at all. I'll try it again tomorrow.


It is super thin and not obvious at all. You can't see it on any teardown video. The only way you know it's there is that the silver isn't shiny. It's dull and doesn't reflect light at all.
You can call it what you wish...conformal coating, oxide layer, I don't know. Just be really fricking careful when you're scraping, please. You want to do VERY VERY small strokes. The longer it takes you, the better. And please cover the shunt area with Super 33+ tape or some sort of tape protection. And make sure you clean off any paint flakes completely (the tape will save you a LOT of work in the end. It's worth having).

I tried scraping pre-packaged shunts the same way and didn't see any coating on them at all, while there was a very thin residue of something on the edges of the original shunts.

Note: I do NOT know if using an untinned hot soldering iron tip to melt the coating will work or not.


----------



## Felgor

Is this the package size required for the RTX 3000 shunt mod?










Panasonic 5mΩ, 2512 (6432M) Thick Film SMD Resistor ±1% 1W - ERJM1WSF5M0U | RS












Seemed closest so I ordered some.


----------



## mouacyk

My shunt resistors are here, but ^^ sounds so dangerous now...  I do have electrical tape and liquid version too, so will try to be as careful as possible in both scraping and application of added shunts.


----------



## Felgor

mouacyk said:


> My shunt resistors are here, but ^^ sounds so dangerous now...  I do have electrical tape and liquid version too, so will try to be as careful as possible in both scraping and application of added shunts.


Would you mind confirming the part number you have so I can make sure what I ordered is ok?


----------



## Falkentyne

mouacyk said:


> My shunt resistors are here, but ^^ sounds so dangerous now...  I do have electrical tape and liquid version too, so will try to be as careful as possible in both scraping and application of added shunts.


For painting shunts, use Super 33+ tape. This is the best vinyl tape on the market and excellent adhesion without risking any damage to the PCB. This will completely protect the PCB from accidental paint.

Every person worth his salt should have this tape. It is SO useful.

For soldering, use 3M high temp polyimide tape. 1/4" works best because you can just cut the layers you want, since you want it to fit in the gaps of some shunt areas. While 3/4" may seem better, you would have to trim it anyway to get into some narrow spots.

This is the tape I used for protecting my PCB when soldering. You have no idea how good this tape is. It's worth the price over crappy cheap chinese tape.





----------



## Pupkin_San

Hi guys! Recently bought an EVGA 3080 FTW3 Ultra, flashed the XOC BIOS, and the card still cannot draw more than 410W (the perfcap reason says Pwr) under load. In theory it should be capable of drawing 450W. Is this a hardware limitation I'm stuck with, and do I need the shunt mod? Or maybe I should try to flash another BIOS? Thanks!


----------



## Falkentyne

Pupkin_San said:


> Hi guys! Recently bought an EVGA 3080 FTW3 Ultra, flashed the XOC Bios and the card still cannot draw more than 410W (saying that perfcap is Pwr) under load. In theory it must be capable of drawing 450W. Is this the hardware limitation i'm stuck into, and do i need the shunt mod? Or, maybe i should try to flash another Bios? Thanks!


Hardware flaw.

You need to shunt mod the PCIE Slot shunt with a 10 mOhm shunt resistor, either stacked with solder on top of the 005 one, or you can paint it with MG 842AR silver paint without using another shunt, as the paint acts like its own 10-15 mOhm shunt in parallel. Painting requires that you scrape the edges of the silver part of the original shunt fully to remove the conformal coating (even if you don't see it--it's there).

If you don't want to mod the card, return it and get a Strix
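To see why a stacked resistor (or a paint bridge, which acts like a 10-15 mOhm shunt in parallel, as described above) raises the effective limit: the controller still divides the sensed voltage drop by the stock shunt value, so its reading shrinks by the ratio of effective to stock resistance. A minimal sketch, assuming a 5 mOhm stock shunt (values illustrative, not measured):

```python
def parallel(r1_mohm, r2_mohm):
    """Combined resistance of two shunts in parallel, in milliohms."""
    return r1_mohm * r2_mohm / (r1_mohm + r2_mohm)

def actual_power(reported_w, stock_mohm=5.0, added_mohm=10.0):
    """True draw when firmware still divides the sensed drop by the stock value."""
    r_eff = parallel(stock_mohm, added_mohm)
    return reported_w * stock_mohm / r_eff

# A 10 mOhm bridge over a 5 mOhm shunt under-reads by a third:
# the card "sees" 75 W on a rail that is really pulling ~112 W.
print(actual_power(75))                   # ~112.5
print(actual_power(75, added_mohm=15.0))  # ~100.0
```

The same ratio is behind the rule of thumb that stacking 5 mOhm on 5 mOhm doubles the real draw at any reported limit.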


----------



## Pupkin_San

Falkentyne said:


> Hardware flaw.
> 
> You need to shunt mod the PCIE Slot shunt with a 10 mOhm shunt resistor, either stacked with solder on top of the 005 one, or you can paint it with MG 842AR silver paint without using another shunt, as the paint acts like its own 10-15 mOhm shunt in parallel. Painting requires that you scrape the edges of the silver part of the original shunt fully to remove the conformal coating (even if you don't see it--it's there).
> 
> If you don't want to mod the card, return it and get a Strix


Either way voids the warranty, right? If yes, then it's soldering time  There's no going back 'cause I've already watercooled the card. I'd be grateful if you could share some pics of the resistor in question. Thanks!

P.S. Is there a way to know whether I won the silicon lottery or not before the mod? Should I try to undervolt the card?


----------



## mouacyk

Felgor said:


> Would you mind confirming the part number you have so I can make sure what I ordered is ok?


Got mine from Newark quite cheaply. Shipping was the majority cost. See MFR# below... 15MU and 20MU


----------



## acoustic

Falkentyne said:


> Hardware flaw.
> 
> You need to shunt mod the PCIE Slot shunt with a 10 mOhm shunt resistor, either stacked with solder on top of the 005 one, or you can paint it with MG 842AR silver paint without using another shunt, as the paint acts like its own 10-15 mOhm shunt in parallel. Painting requires that you scrape the edges of the silver part of the original shunt fully to remove the conformal coating (even if you don't see it--it's there).
> 
> If you don't want to mod the card, return it and get a Strix


Has anyone figured out what's causing some cards to be 30-40W under the actual power limit? My FTW3 Ultra will hit 450W all day long, and in games like Metro Exodus with large transient spikes, I've seen upwards of 475-480 watts while the card adjusts voltages to get back under the limit.


----------



## Imprezzion

I wonder... would any of those higher-limit BIOSes work on a Gigabyte Gaming OC (Eagle PCB)? It has dual BIOS so I can quite safely just try it, but even if it does work software-wise, how much of a power-limit gain would it get me above the stock 370W limit if I don't shunt it?

I kinda don't wanna shunt it and hurt resale value if I do upgrade again later on to a 3080 Ti or 3090 or whatever... I never did shunt my 2080 Ti either, and that did fine well over 450W on an XOC BIOS.


----------



## VPII

Imprezzion said:


> I wonder.. would any of those higher limit BIOS work on a Gigabyte Gaming OC (eagle PCB)? It has dual BIOS so I can quite safely just try it but even if it does work software wise, how much of a profit power limit wise would it get me above the stock 370w limit if I don't shunt it.
> 
> I kinda don't wanna shunt it and hurt resale value if I do upgrade again later on to a 3080 Ti or 3090 or whatever.. I never did shunt my 2080 Ti either and that did fine well over 450w on a XOC BIOS.


370W is sort of the limit for 2x 8-pin PCIe power connectors. You will not gain much, if anything, from flashing a 400W-or-more BIOS, as those BIOSes expect 3x 8-pin PCIe power connectors. As such, if you do flash it, it will mirror one of your PCIe connectors to make up for the third...


----------



## Imprezzion

VPII said:


> 370W is sort of the limit for 2x 8-pin PCIe power connectors. You will not gain much, if anything, from flashing a 400W-or-more BIOS, as those BIOSes expect 3x 8-pin PCIe power connectors. As such, if you do flash it, it will mirror one of your PCIe connectors to make up for the third...


Yeah, that doesn't sound like a good idea hehe. I know 375W is the theoretical limit, and an 8-pin can pull a bit more than 150W pretty safely, but I don't wanna risk it really. The PSU and cables will handle it fine I guess, but still. I run a Seasonic Prime Gold 1000W with CableMod cables, and I run 2 separate 8-pin cables even though each cable has 2 connectors, so the cables themselves can do the full 300W.
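For context, the 375W theoretical figure is just the sum of the connector specs (75W from the slot, 150W per 8-pin), which is why the 2x 8-pin BIOSes top out around 370W. A trivial sanity check, assuming spec-rated limits only:

```python
SLOT_W = 75        # PCIe slot, per the CEM spec
EIGHT_PIN_W = 150  # PCIe 8-pin auxiliary connector, per spec

def spec_budget(n_eight_pins):
    """Spec-rated board power for a card with n 8-pin inputs plus the slot."""
    return SLOT_W + n_eight_pins * EIGHT_PIN_W

print(spec_budget(2))  # 375 -- the ceiling 2x 8-pin BIOSes respect
print(spec_budget(3))  # 525 -- the headroom 3x 8-pin BIOSes assume
```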


----------



## mouacyk

Imprezzion said:


> Yeah that doesn't sound like a good idea hehe. I know 375w is the theoretical limit but a 8 pin can pull a bit more then 150w pretty safely but I don't wanna risk it really. PSU and cables will handle it fine I guess but still. I run a Seasonic Prime Gold 1000w with Cablemod cables and I do run 2 separate 8 pin cables even tho they have 2 connectors for the cable so the cables themselves can do the full 300w.


I don't think anyone intends to run 500W+ continuously. I hope to achieve something similar to the XOC BIOS on my 1080 TI, where GPU clocks are not penalized by surpassing the power limits for gaming loads. Probably only furmark and similar loads will push 500W+ so games should max around 400W with reasonable clocks.


----------



## Muqeshem

Guys, what are your results for Assassin's Creed Valhalla?

I got this result.


----------



## Muqeshem

Also, are those temps OK for the wattage?


----------



## Colonel_Klinck

OK, I did the 2x 8-pin shunts again. Still not perfect, but it's now a 7 to 8W difference instead of 20W. Can't be bothered to drain the loop and strip it again today, but I'll have another attempt, maybe at the weekend. Next time I think I'll scrape off all the old paint and start again. Maybe the difference is the thickness of the paint, as I've just applied more on top.


----------



## Hirtle

I'm still having issues shunt modding my card. I pulled all the added shunts off, scraped the stock ones and the new ones, then reapplied with MG842AR. It looks like the mod worked for MVDDC and PWR_SRC, but not for the 8 pins. I think I'll abandon the idea of applying them with the MG842AR and solder them instead. Here's a comparison while running Furmark. The left side is after my attempt to shunt mod it and the right side is stock.


----------



## Falkentyne

Colonel_Klinck said:


> OK, I did the 2x 8-pin shunts again. Still not perfect, but it's now a 7 to 8W difference instead of 20W. Can't be bothered to drain the loop and strip it again today, but I'll have another attempt, maybe at the weekend. Next time I think I'll scrape off all the old paint and start again. Maybe the difference is the thickness of the paint, as I've just applied more on top.


A 7W difference is about as much as can be expected from paint. I have a 5W difference with solder stacking at 400W (Overwatch 4K), and 6-9W at 550W (Timespy, Port Royal, Heaven).
As long as it remains stable, you're good.
If you want permanent results, you need to solder.


----------



## Falkentyne

Hirtle said:


> I'm still having issues shunt modding my card. I pulled all the added shunts off, scraped the stock ones and the new ones, then reapplied with MG842AR. It looks like the mod worked for MVDDC and PWR_SRC, but not for the 8 pins. I think I'll abandon the idea of applying them with the MG842AR and solder them instead. Here's a comparison while running Furmark. The left side is after my attempt to shunt mod it and the right side is stock.
> 
> View attachment 2476993


Soldering is reliable.
Can you run the Heaven benchmark instead of Furmark, though? Furmark is already throttled in the drivers.


----------



## Felgor

mouacyk said:


> Got mine from Newark quite cheaply. Shipping was the majority cost. See MFR# below... 15MU and 20MU
> View attachment 2476931


Thanks, exactly the same as I ordered in 3Mo and 5Mo.


----------



## mouacyk

@Hirtle Re-tracing your posts, it seems you ignored the slot shunt? That was why it didn't work for der8auer. He was still throttled on power balancing, and you would have been too. It's in the first post of the "Easy Shunt Mod" thread that all shunts need to be modded, to scale the power balancing in unison.



Felgor said:


> Thanks, exactly the same as I ordered in 3Mo and 5Mo.


Are you replacing the shunts? Stacking those will definitely blow any fuses if your card has them.


----------



## Colonel_Klinck

Falkentyne said:


> 7W difference is about as much as can be expected from paint. I have 5W difference with solder stacking at 400W (Overwatch 4k), and 6-9W at 550W (TImespy, Port Royal, Heaven)
> As long as it remains stable, you're good.
> If you want permanent results, you need to solder.


Ah ok, that was running Heaven so that seems all good. Happy days!


----------



## Hirtle

mouacyk said:


> @Hirtle Re-tracing your posts, it seems you ignored the slot shunt? That was why it didn't work for der8auer. He was still throttled on power balancing, and you would have been too. It's in the first post of the "Easy Shunt Mod" thread that all shunts need to be modded, to scale the power balancing in unison.
> 
> 
> Are you replacing the shunts? Stacking those will definitely blow any fuses if your card has them.


I recall that in his video he saw lower reported power draw from the 8 pins before even touching the shunt. I guess it won't hurt to mod it as well since I've never seen it exceed 50 W. The only reason I wanted to avoid it was because theoretically it could allow the card to pull 150 W from the slot. Now I can't decide if I want to try soldering them or keep using the paint.


----------



## mouacyk

Hirtle said:


> I recall that in his video he saw lower reported power draw from the 8 pins before even touching the shunt. I guess it won't hurt to mod it as well since I've never seen it exceed 50 W. The only reason I wanted to avoid it was because theoretically it could allow the card to pull 150 W from the slot. Now I can't decide if I want to try soldering them or keep using the paint.


With Ampere power balancing, it's unfortunately impossible to limit the slot to a lower power level without hindering others. @bmgjet has a calculator that shows what shunt resistors to stack or replace for a given power limit you're targeting. His example targets of 525W and 500W seem relatively conservative for 2x8pins, with the former drawing up to 100W through the slot.

I just can't believe some of you are not doing all the research you can before attempting this mod lol. I've done all this research, yet with my luck, I'm still going to muck something up.
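For the curious, the arithmetic such a calculator presumably does can be run in reverse: given a target real limit, solve for the resistor to stack in parallel so the BIOS cap scales up to it. A hedged sketch, assuming 5 mOhm stock shunts and a 370W BIOS limit (check your own board's values):

```python
def stacked_for_target(target_w, bios_limit_w=370.0, stock_mohm=5.0):
    """Resistor (mOhm) to stack in parallel so the real cap becomes target_w.

    Firmware under-reads by (effective / stock), so we need
    effective = stock * bios_limit / target, then solve the
    parallel-resistance formula for the added part.
    """
    r_eff = stock_mohm * bios_limit_w / target_w
    return stock_mohm * r_eff / (stock_mohm - r_eff)

# Targeting ~500 W real from a 370 W BIOS works out to ~14.2 mOhm,
# i.e. the nearest standard part is the oft-recommended 15 mOhm.
print(round(stacked_for_target(500.0), 1))  # 14.2
```

A sanity check on the design choice: asking for double the BIOS limit returns 5 mOhm, matching the stack-equal-value rule of thumb.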


----------



## Falkentyne

Hirtle said:


> I recall that in his video he saw lower reported power draw from the 8 pins before even touching the shunt. I guess it won't hurt to mod it as well since I've never seen it exceed 50 W. The only reason I wanted to avoid it was because theoretically it could allow the card to pull 150 W from the slot. Now I can't decide if I want to try soldering them or keep using the paint.


Yes, you can get lower power draw from the 8-pins, but then the balancing goes haywire and you wind up throttling anyway: either on a different rail that isn't shunted but has continuity with the shunts you modded, or on a rail that you did shunt that is linked to one you didn't; or the effective clocks throttle you; or you just randomly hit a power limit; or one of the 8-pins winds up skyrocketing because you didn't mod SRC (usually on 3x 8-pin cards).

You're not going to pull 150W from the slot anyway. The MSVDD / NVVDD limits are going to throttle you long before you get close, unless you mod the 1206-sized shunts (this is still unconfirmed; the only thing confirmed so far is that the 1206 shunts directly control the MSVDD and NVVDD rails).


----------



## Micko

I extracted the 3080 TUF BIOS updates from Asus' official site - TUF-RTX3080-O10G-GAMING BIOS & FIRMWARE | Graphics Cards | ASUS USA - and compared them in Ampere BIOS Editor. Release notes aside, does either of these two BIOS updates seem better than the original one, keeping in mind that the 375W PL is the main limiting factor of the TUF's performance?










Another question: is there any link where I can read more about the power tables presented above? I tried googling but no luck so far.


----------



## Falkentyne

Micko said:


> I extracted 3080 TUF BIOS updates from official Asus' site - TUF-RTX3080-O10G-GAMING BIOS & FIRMWARE | Graphics Cards | ASUS USA and compared them in Ampere BIOS Editor. Release notes aside, do any of these two bios upgrades seem better than the original one if we keep in mind that 375w PL is the main limiting factor of TUF performance ?
> 
> View attachment 2477015
> 
> 
> Another question is, is there any link where where i can read more about power tables presented above. I tried googling but no luck so far.


The third one has an atrocious VRAM Power limit. Avoid.


----------



## Felgor

mouacyk said:


> @Hirtle Re-tracing your posts, it seems you ignored the slot shunt? That was why it didn't work for der8auer. He was still throttled on power balancing, and you would have been too. It's in the first post of the "Easy Shut Mod" thread, that all shunts need to modded to scale the power balancing in unison.
> 
> 
> Are you replacing the shunts? Stacking those will definitely blow any fuses if your card has them.


Stacking, and they are 5 mΩ. Did I miss something?

Same type as you have.


----------



## mouacyk

th3illusiveman said:


> What's this stuff about an "effective clock" not matching what's shown in Afterburner and thus costing some performance? Saw somewhere that AB might not be reporting the actual frequency?


I don't think I've gotten a good answer to this either, but based on my observation -- it looks like a heuristic interpolation that takes into account the throttling. Some kind of changing average clock. This is not to say that the discrete GPU clock is wrong -- it's still correct that it runs at that clock for that instant.
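One way to picture that interpolation: treat the effective clock as a time-weighted average over the polling window, so brief throttle dips drag it below the instantaneous readout. A toy model of this interpretation (my reading of the behavior, not any tool's documented method):

```python
def effective_clock(clocks_mhz, dwell_s):
    """Time-weighted average clock across one polling window."""
    total = sum(dwell_s)
    return sum(c * t for c, t in zip(clocks_mhz, dwell_s)) / total

# Instantaneous readout shows 2100 MHz, but the GPU spends 20% of the
# window throttled to 1800 MHz -- the effective clock lands at ~2040.
print(effective_clock([2100, 1800], [0.8, 0.2]))
```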



Felgor said:


> Stacking, and they are 5 milli ohm. Did I miss something?
> 
> Same type as you have.


See this too: GitHub - bmgjet/ShutMod-Calculator: Work out what shunt values to use easily.
With a 15mOhm resistor stacked on the slot shunt, you're already pulling up to 100W through the PCIe slot. If you have a 10 amp fuse, you're within 24W of blowing it. Or, the 24-pin power connector may have trouble delivering that much power through the slot on lower quality boards. Not sure if you reviewed the "Easy Shunt Mod" thread, but the first post is worth a full read -- especially the update for the 2x 8-pin section, where 15mOhm or 20mOhm resistors are recommended.
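The fuse arithmetic above is just Ohm's law on the slot's 12V pins. A small sketch, assuming a nominal 12V rail (the exact margin depends on the true rail voltage and the fuse rating on your particular board):

```python
def slot_current_a(power_w, rail_v=12.0):
    """Current through the slot's 12 V pins for a given draw."""
    return power_w / rail_v

def fuse_margin_w(power_w, fuse_a=10.0, rail_v=12.0):
    """Watts of headroom left before a series fuse's rated current."""
    return fuse_a * rail_v - power_w

print(slot_current_a(100.0))  # ~8.3 A through the slot at 100 W
print(fuse_margin_w(100.0))   # ~20 W of headroom on a 10 A fuse at 12 V
```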


----------



## Felgor

mouacyk said:


> I don't think I've gotten a good answer to this either, but based on my observation -- it looks like a heuristic interpolation that takes into account the throttling. Some kind of changing average clock. This is not to say that the discrete GPU clock is wrong -- it's still correct that it runs at that clock for that instant.
> 
> 
> See this too: GitHub - bmgjet/ShutMod-Calculator: Work out what shunt values to use easily.
> With a 15mOhm resistor stacked on the slot shunt, you're already pulling up to 100W through the PCIe slot. If you have a 10 amp fuse, you're within 24W of blowing it. Or the 24-pin power connector may have trouble delivering that much power through the slot on lower quality boards. Not sure if you reviewed the "Easy Shut Mod" thread, but the first post is worth a full read -- especially the update for the 2x-8pin section, where 15mOhm or 20mOhm resistors are recommended.


Thanks, looking through that thread now. Seems I was using info from 3x8pin cards or cards with non-reference fuses, etc. I'll confirm the fuses on my card first and most likely just use 15 to 20mOhm.


----------



## ZealotKi11er

Falkentyne said:


> The third one has an atrocious VRAM Power limit. Avoid.


Maybe it helps with mining? The current limit makes the 3080 run over 100C.


----------



## Hirtle

mouacyk said:


> With Ampere power balancing, it's unfortunately impossible to limit the slot to a lower power level without hindering others. @bmgjet has a calculator that shows what shunt resistors to stack or replace for a given power limit you're targeting. His example targets of 525W and 500W seem relatively conservative for 2x8pins, with the former drawing up to 100W through the slot.
> 
> I just can't believe some of you are not doing all the research you can before attempting this mod lol. I've done all this research, yet with my luck, I'm still going to muck something up.


I think it's unfair to say that I didn't do any research since I have, in fact, done plenty. Everything that I've seen indicated that the slot shunt didn't necessarily need to be modded.




Falkentyne said:


> Yes you can get lower power draw from the 8 pins, but then the balancing goes haywire and you wind up throttling anyways, either on a different rail that isn't shunted, that has continuity with the shunts you modded, on a rail that you did shunt that is linked to one you didn't, or the effective clocks throttle you, or just randomly hitting a power limit, or one of the 8 pins winds up skyrocketing because you didn't mod SRC (usually on 3x8 pin cards).
> 
> You're not going to pull 150W from the slot anyway. The MSVDD / NVVDD limits are going to throttle you long before you get close, unless you mod the 1206-sized shunts (this is still unconfirmed; the only thing confirmed so far is that the 1206 shunts directly control the MSVDD and NVVDD rails).


Falkentyne, from what I've seen, if the mod works correctly, all associated power rails should report lower power in hwinfo/GPUZ. I did mod the shunts for MSVDD and SRC and they are responding as expected per my screenshots. So they seem to be fine. I was wondering why I didn't see a drop in reported power on the 8 pins as I should expect from a properly working shunt mod.

Speaking of continuity, I checked for continuity between the top shunt on each stack and the 12V pins on the connectors when I had the card apart this morning. That makes me think the mod should absolutely be working. It seems that I do in fact need to mod the slot shunt, as there seem to be some weird power balancing issues going on.


----------



## Falkentyne

Hirtle said:


> I think it's unfair to say that I didn't do any research since I have, in fact, done plenty. Everything that I've seen indicated that the slot shunt didn't necessarily need to be modded.
> 
> 
> 
> Falkentyne, from what I've seen, if the mod works correctly, all associated power rails should report lower power in hwinfo/GPUZ. I did mod the shunts for MSVDD and SRC and they are responding as expected per my screenshots. So they seem to be fine. I was wondering why I didn't see a drop in reported power on the 8 pins as I should expect from a properly working shunt mod.
> 
> Speaking of continuity, I checked for continuity between the top shunt on each stack and the 12V pins on the connectors when I had the card apart this morning. That makes me think the mod should absolutely be working. It seems that I do in fact need to mod the slot shunt as it seems there is some weird power balancing issues going on.


You didn't mod the slot shunt? Oh dear...


----------



## ssgwright

Falkentyne said:


> The third one has an atrocious VRAM Power limit. Avoid.


really? I'm getting better oc results on the newer one (on the core anyway)


----------



## Hirtle

To make it even better, I've had the card apart twice now and purposely didn't mod the slot shunt. 🤦‍♂️


----------



## Falkentyne

Hirtle said:


> To make it even better, I've had the card apart twice now and purposely didn't mod the slot shunt. 🤦‍♂️


Consider yourself lucky you don't have to mod this contraption to bypass the 600W "stealth" power rail limit, which we think is controlled by the small 1206 shunts. We know 100% now that these two shunts control MSVDD and NVVDD power rails, and these rails are not reported on hwinfo64.

Just look at those depressed silver edges (lower than the middle housing).


----------



## ssgwright

ssgwright said:


> really? I'm getting better oc results on the newer one (on the core anyway)


saw no difference in mem overclock between the two bios... however, I get a way better overclock on the core with the new bios, the card is able to hold a higher vcore with the new bios


----------



## theforcedk

After some hours of tweaking: 19,006 -- Founders card, water cooled. Mini-ITX Asus Z490-I, RAM clocked at 4100CL17 (from 3600CL16) with some tight timings. Regular system with daily apps installed, not a clean install.. may be some bloat running in the background. I can't go much higher than this: I scored 19 006 in Time Spy. Oh well, my goal was to break 19000.


----------



## Hirtle

Falkentyne said:


> Consider yourself lucky you don't have to mod this contraption to bypass the 600W "stealth" power rail limit, which we think is controlled by the small 1206 shunts. We know 100% now that these two shunts control MSVDD and NVVDD power rails, and these rails are not reported on hwinfo64.
> 
> Just look at those depressed silver edges (lower than the middle housing).


I see what you mean now, I didn't know you were talking about a FE card. That's one of the many reasons why I didn't want an FE card for this generation.


----------



## mouacyk

@Hirtle I think @Falkentyne may be cross-referencing between 3080 and 3090's too, so keep that in mind. He has a 3090 FE.


----------



## leegoocrap

Falkentyne said:


> Found the problem.
> You need to re-paint your 8 pin #2 shunt. Make sure you get proper contact and make sure it's very well scraped first!
> Remember what I told you before.
> Use 3M high temp polyimide tape (very high quality) or 3M Super 33+ tape completely around the shunt and make sure you -rub- the paint onto the conductive edges of the shunt so it works its way in, then bridge the edges. Taping around the shunt will allow you more safety in working in the paint without accidentally getting paint on the PCB.
> 
> You're hitting the individual rail power limit (150W), which is equal to the SRC power limit (the rails and SRC are linked).


So... I re-did the shunt, but no change on pin #2 -- it still runs right into 150w... I scraped the crap out of the edges and applied the paint pretty carefully and thick... it seems pretty unlikely (although of course not impossible) that I didn't get good contact/coverage twice.
Are we sure that pin #2 on an EVGA XC3 is the one on the outside? (From bmgjet's picture -- which is of course a 3090, not a 3080, but still -- it's marked that pin 1 should have the fuse directly above it and be closer to the die, and pin 2 should be on the outside of the card with the fuse directly beside it... right?) Looking at gpu-z/hwinfo compared to before... it really looks like that outer shunt is pin 1.


----------



## Falkentyne

leegoocrap said:


> So... I re-did the shunt, but no change to pin #2 - still runs right into 150w... I scraped the crap out of the edges and applied paint pretty carefully and thick... seems pretty unlikely (although of course not impossible) that I didn't get good contact/coating twice.
> Are we sure that Pin#2 on an EVGA xc3 is the one on the outside? (from bmgjets picture, which is of course a 3090 not a 3080, but still - it's marked that pin 1 should have the fuse directly above it and be closer to the die, pin 2 should be on the outside of the card with the fuse directly beside it...right? Looking at gpu-z/hwinfo compared to before... it really looks like that outer shunt is pin1.
> 
> View attachment 2477063


When someone took a multimeter to the board, they said pin 1 was the one at the very top, by itself, and pins 2 and 3 were in the group of 4, in the top row.
But I don't have this board. Also:

Oh...please check the paint on the MVDDC and SRC shunts. Imbalanced 8 pins can happen if the SRC is not shunted properly.
Your MVDDC and SRC are both reading very high. But high SRC can also happen from an 8 pin not being shunted....


----------



## leegoocrap

Falkentyne said:


> When someone took a multimeter to the board, they said pin 1 was the one at the very top, by itself and pin 2 and 3 were in the group of 4, at the top row.
> But I don't have this board. Also:
> 
> Oh...please check the paint on the MVDDC and SRC shunts. Imbalanced 8 pins can happen if the SRC is not shunted properly.
> Your MVDDC and SRC are both reading very high. But high SRC can also happen from an 8 pin not being shunted....


I checked with a multimeter: the outside plug connector (looking at the board from the top down, the one on the right, in red) runs to the shunt that I re-painted. That's only helpful if we're sure that what gpu-z/hwinfo reads as pin 2 is that connector, though.








haha... it kind of sounds like I should just strip the board and start over. I can definitely speed run removing the hybrid cooler and backplate at this point.
Think I'd be better off with a layer of paint + an actual resistor vs. just trying to paint? Might remove some element of chance.


----------



## Falkentyne

leegoocrap said:


> I checked with a multimeter, the outside (looking at the board from the top down, the one on the right - in red) plug connector runs to the shunt that I re-painted. That's only helpful if we were sure that what gpu-z/hwinfo reads as pin2 is that connector though.
> View attachment 2477073
> 
> haha... it kind of sounds like I should just strip the board and start over. I can definitely speed run removing the hybrid cooler and backplate at this point.
> Think I'd be better off with a layer of paint+ an actual resistor vs. just trying to paint? Might remove some element of chance.


I would check and start over. Did you check the SRC though?
I think the SRC is that one shunt at the bottom in that crowded area. I don't remember, though. I do know MVDDC and GPU Chip Power are the lower two in the group of 4. Check MVDDC and chip. Your MVDDC should be below 80W.


----------



## leegoocrap

Falkentyne said:


> I would check and start over. Did you check the SRC though?
> I think the SRC is that one shunt at the bottom in that crowded area. I don't remember, though. I do know MVDDC and GPU Chip Power are the lower two in the group of 4. Check MVDDC and chip. Your MVDDC should be below 80W.


Didn't check SRC; I was just focused on what (I at least thought) was pin 2. I did it Monday, so I would think it's dried, but I may give it another day or so just to see if it cures any further...

If pins 1 & 2, MVDDC, and GPU Chip are the 4 clumped together and PCIe is on the back, SRC has to be the one by itself.

I think I might just strip all the paint off (maybe leave what's on the PCIe slot shunt; it looks OK as-is, I think?) and try paint + 5mOhm resistors on the front 5.


----------



## mouacyk

I count six shunts on the Gigabyte Eagle OC PCB; are these the correct ones to mod?


Spoiler: pics


----------



## leegoocrap

^looks right


----------



## mouacyk

Thanks @leegoocrap. I think it's easier than the FE, from what I gathered.

Dug up something of @bmgjet from EVGA, regarding inductor cooling:


> When I got my XC3 block back in Nov, EKWB instructions had you putting 1.5mm thermal pads on them, but they didn't make contact with anything.
> Emailed both EVGA and EKWB support about it. EVGA emailed back first and said they are cooled from the factory, so *they should be cooled when using a waterblock since they will have reduced air flow. (Makes sense to me)*.
> EKWB's reply was that I could return the block if I wasn't happy with them not making contact.
> I forwarded them the email from EVGA, and then they said they would update their install instructions and include 4.5mm thermal pads from now on.


Makes sense to me too, so I will be looking for metal shims to fill the gap and use the thinnest thermal pads on both sides. The mentioned 4.5mm and 6mm pads are just way too thick -- wonder if they actually do any good.


----------



## Imprezzion

Well, I finally got my 3080 guys! Just picked up the Gigabyte 3080 Gaming OC locally bnib and it works fine. Very beautiful card but my god that thing is long.

It is running great, very quiet even at 80% fan speed; temps sat at a flat 70c the entire time with the core at 1925-1960MHz, but it is only drawing 340w. I thought this card had a 370w limit.

EDIT: According to TPU there is a 345w BIOS as well.
Weird thing is, my BIOS version doesn't exist anywhere on TPU, for any of the 3080s..
94.02.42.40.33 is my version.. weird.


----------



## mouacyk

Imprezzion said:


> Well, I finally got my 3080 guys! Just picked up the Gigabyte 3080 Gaming OC locally bnib and it works fine. Very beautiful card but my god that thing is long.
> 
> It is running great, very quiet even on 80% fanspeed, temps sat at flat 70c the entire time at 80% fanspeed with 1925-1960Mhz core but it is only drawing 340w. I thought this card had 370w limit.
> 
> EDIT: According to TPU there is a 345w BIOS as well.
> Weird thing is, my BIOS version doesn't exist on TPU anywhere, for none of the 3080's..
> 94.02.42.40.33 is my version.. weird.


If you run furmark (or an equally intensive load) and watch GPU-Z, you will see it actually slightly exceed 370W. As with all other cards, the practical limit seems to be around 350W. The 370W limit is not enforced consistently -- which I think was done on purpose, so that throttling backs off slowly towards 350W.
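If you want to check that behaviour on your own card, log power draw (e.g. `nvidia-smi --query-gpu=power.draw --format=csv -l 1`, or a GPU-Z sensor log) and summarize it to separate the transient peaks from the sustained average. A rough sketch, assuming a simple one-reading-per-line log -- adapt the parsing to whatever your tool actually emits:

```python
# Rough sketch: summarize a power-draw log (one watt reading per line, e.g.
# "348.1 W") to compare transient peaks against the sustained average.
# The log format is an assumption; real GPU-Z logs are CSV with many columns.

def summarize(log_lines):
    """Return (peak_watts, average_watts) from a list of 'NNN.N W' lines."""
    watts = [float(line.strip().rstrip(" W")) for line in log_lines if line.strip()]
    return max(watts), sum(watts) / len(watts)

log = ["348.1 W", "371.9 W", "352.4 W", "349.6 W"]
peak, avg = summarize(log)
print(f"peak {peak:.1f} W, average {avg:.1f} W")
```

On a card like the one described above you'd expect the peak to poke past the nominal limit while the average settles near 350W.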


----------



## Felgor

Watercooling for the backplate: MP5WORKS – Watercooling Components (mp5works.com)


----------



## Imprezzion

mouacyk said:


> If you run furmark (or equally intensive load) and watch GPUz, you will see it actually slightly exceed 370W. As with all other cards, the practical limit seems to be around 350W. The 370W limit is not consistent -- which I think was done on purpose, in order for throttling to back off slowly towards 350W.


I've been running Division 2 on it for a bit and it hovers around 345-355w all the time yes. Around 1925 core @ 1.006v.

Just for testing, I flashed one of the "older" Gaming OC BIOSes onto the "silent" slot, which I don't intend to use; that one had 370w as well, and it's the exact same across the board.

I raised the memory (micron) to +500, held it fine so far. +50 on the core as well. Usually sitting around 1965 now. How high can I expect on average?

I am going to watercool it with either the Kraken G12 + X52 I have laying around from my 2080 Ti, or the Bykski full cover block once I have the parts together for a custom loop -- not for the card per se, but for the rest of the system's temps. My memory is like 10c hotter now, and B-Die hates high temps with a high OC, so..


----------



## mouacyk

Imprezzion said:


> I've been running Division 2 on it for a bit and it hovers around 345-355w all the time yes. Around 1925 core @ 1.006v.
> 
> Just for testing I flashed one of the "older" Gaming OC BIOS on the "silent" slot which I don't intent to use and that had 370w as well and it's the exact same across the board so.
> 
> I raised the memory (micron) to +500, held it fine so far. +50 on the core as well. Usually sitting around 1965 now. How high can I expect on average?


I think the really good VRAM can hit in excess of 22,000MHz (3GHz over stock). I seem to have passed 30 minutes of memory error checking at 22,200MHz, but it crashes in random games (CryEngine), while 22,000MHz will hold in everything. Not sure if it helped, but I have thermal pads on the backside of the memory and an extra copper heatsink drawing heat from the backplate. Memory doesn't exceed 74C in games. I tried the NiceHash test miner and temps reach 90C -- I don't know of anything else that pushes the memory harder.
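For translating Afterburner offsets into the effective data rates being quoted here, a sketch -- it assumes the common reading that an AB memory offset on these cards adds twice its value to the 19,000 MHz stock effective rate, which lines up with +1500 landing at roughly 22,000 MHz in this thread:

```python
# Sketch: convert an Afterburner memory offset to the effective GDDR6X data
# rate quoted in this thread. ASSUMPTION: the offset adds twice its value to
# the 19,000 MHz stock effective rate (consistent with +1500 -> ~22,000 MHz).

STOCK_EFFECTIVE_MHZ = 19000  # RTX 3080 stock effective memory rate

def effective_rate(ab_offset_mhz):
    return STOCK_EFFECTIVE_MHZ + 2 * ab_offset_mhz

for offset in (500, 1200, 1500):
    print(offset, "->", effective_rate(offset), "MHz")  # 20000, 21400, 22000
```

Treat the 2x factor as an assumption to verify against GPU-Z on your own card, not a spec.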


----------



## EarlZ

Installed my Master 3080 Rev2 last night and I am very pleased with the card; it's quite huge, though, for my tiny case! The 3rd 8-pin is doing its work, allowing my card to draw something like 440W+.
I can see that the memory Tjunction goes up to 100C. Not sure how to improve that, as I get those temps even with an 85% fan speed.


----------



## Imprezzion

mouacyk said:


> I think the really good VRAM can hit excess of 22,000MHz (3GHz over stock). I seem to have passed 30 minutes of memory error checking at 22,200MHz but it crashes in random games (CryEngine) while 22,000MHz will hold in everything. Not sure if it helped, but I have thermal pads on backside of memory and extra copper heatsink drawing heat from backplate. Memory doesn't exceed 74C in games. I tried the NiceHash test miner and temps reach 90C -- don't know of anything else that pushes the memory harder.


Holy... even at 100% fan speed my memory junction temps are 82c in Division 2 @ 20000MHz. The core is only at 60c.. This VRAM gets hot lol. I wonder if the Gigabyte has thermal pads under the backplate. I am considering just taking it apart for a sec, even though it's brand new, just to see how the PCB layout is and if it's doable to use the Kraken G12 + X52 combo. I have loads and loads of copper heatsinks laying around and probably some Sekisui thermal tape as well. Might as well repaste it with some proper paste while I'm at it.


----------



## Glottis

ssgwright said:


> saw no difference in mem overclock between the two bios... however, I get a way better overclock on the core with the new bios, the card is able to hold a higher vcore with the new bios


That comparison screenshot shows that the November BIOS has increased power limits for the 8-pins and the PCIe slot. Did you see higher power draw and better benchmark scores on the new BIOS? Currently my TUF is on the October update (that's what it had out of the box). Not sure if it's worth updating?


----------



## ssgwright

Glottis said:


> That comparison screenshot shows that November BIOS has increased power limits for 8pins and PCIE slot. Did you see higher power draw and better benchmark scores on the new BIOS? Currently my TUF is on October update (that's what it had out of the box). Not sure if it's worth updating?


it is, saw big improvements in oc with the new bios


----------



## Imprezzion

Ok so, I stripped down the Gigabyte Gaming OC to see if the PCB layout allows mounting the Kraken G12 bracket; unfortunately it doesn't. Chokes interfere with the mount, and the hole spacing is way off as well.

The stock paste application was surprisingly good. I repasted with PK-3, let's see the temps now.

So, next objective: acquire a Bykski Gaming OC 3080/3090 full cover block and make a custom loop for the card. Might as well add the CPU to that right away and go for a 420+280 rad combo with a D5 pump + res and such.. oh boy, this is going to get expensive..

EDIT: Hmm, might be a bad mount.. temps with PK-3 aren't exactly amazing lol. About 4c higher for the core (64c), and memory junction temps are through the roof.. 92c in 3DMark.. should I be worried lol? Where on the card is the sensor for memory junction, or is that just part of the core/die, like the AMD 5700 XT had problems with as well?

EDIT2: Nope, the backplate has no thermal pads or contact with the backside of the VRAM. I added some Arctic thick thermal pads to it and temps dropped about 6-8c.

Also re-did the PK3 application, much better now. It sits around 60c core 82 junction again like it used to.

Also ran a few 3DMark Time Spy runs to see how it performs and how high the memory still scales: I got a total score of 18168 (validated) at +105 core / +1200 memory. +500 memory gave me 17840, so it definitely scales. I'll try some other memory clocks like +800 and +1000; +1500 doesn't run. It seems to run fine for a while but starts to dip FPS hard after a few minutes, and 3DMark isn't happy with it and completes the test with a "0" score and "an error occurred", so +1500 is a bit much to ask. Even that it does +1200 with proper scaling feels incredibly high for running 80+ junction temps.

The core maxed out at +105. If I run +120, the next bin up, it will boost to 2145MHz under light loads, which crashes occasionally. The card is easily capable of running 2115-2130 if it gets enough voltage, but yeah, the 350-355w limit means it never gets the full 1.100v, more like 1.006-1.037v, at which 2100+ isn't very stable. It hovers around 2025-2070, which seems stable.

I did also test undervolting; the card is solid at 1995-2010MHz at 0.987v and doesn't seem to throttle at all then.


----------



## cennis

acoustic said:


> Has anyone figured out what's causing some cards to be 30-40w under the actual power limit? My FTW3 Ultra will hit 450w all day long, and in games like Metro Exodus with large transient spikes, I've seen upwards of 475-480watts while the card adjusts voltages to get back under the limit.


I will be receiving an FTW3 shortly; can you let us know your OC settings? Do you also see ~450W in Port Royal or Time Spy, just so we have something consistent to compare against?


----------



## cennis

Pupkin_San said:


> Hi guys! Recently bought an EVGA 3080 FTW3 Ultra, flashed the XOC Bios and the card still cannot draw more than 410W (saying that perfcap is Pwr) under load. In theory it must be capable of drawing 450W. Is this the hardware limitation i'm stuck into, and do i need the shunt mod? Or, maybe i should try to flash another Bios? Thanks!


Have you tried the Strix bios? There were some reports that only two of the 8-pins were reading out power draw correctly, which may trick the board into using a higher power limit.


----------



## nyk20z3

If any one has a 3080 Strix OC for sale at MSRP let me know, cash in hand.


----------



## Imprezzion

So, there's no 2x8pin BIOS that will magically make a 2x8pin card pull more than 355w?

I mean, my card is a monster imo -- it scales 3DMark scores all the way up to +1200 memory, and the core has no issues boosting well past 2100MHz and staying stable around 2055-2070MHz at as low as 1.006v -- but I wanna have it power throttle a bit less, especially once I get my Bykski block for it. Which is going to be a problem, because they are out of stock at Bykski themselves and there are no local resellers here, so I have to use eBay / Amazon / AliExpress worldwide.. that's going to get expensive quickly and take weeks..

Anyone know of a way to get a Bykski Gigabyte Gaming OC 3080/3090 block in Europe?


----------



## mouacyk

Instead of actively cooling the backplate, would a custom metal shim that contacts both the block and the backplate help cool it better than purely passive cooling?








The shim in red is bent 180° to wrap around the bottom of the backplate. For more surface area, the shim may even be extended to contact the additional block areas circled in green. If it works, the block will probably get much hotter and increase GPU temp, since the backplate is quite hot once it heats up.
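To put a rough number on what a shim like that could move (a back-of-envelope Fourier conduction sketch; the dimensions and the 20C delta are invented, and contact/pad resistance is ignored, so the real figure would be lower):

```python
# Back-of-envelope conduction estimate for a copper shim bridging backplate
# and block. Thermal resistance of a strip is R = L / (k * A): L is the
# length along the heat path, A the cross-sectional area. Dimensions and
# temperature delta below are illustrative; interface resistance is ignored.

K_COPPER = 400.0  # W/(m*K), approximate bulk conductivity of copper

def strip_watts(length_m, width_m, thickness_m, delta_t):
    """Heat carried by a rectangular strip for a given end-to-end delta T."""
    area = width_m * thickness_m
    resistance = length_m / (K_COPPER * area)  # K/W
    return delta_t / resistance

# 30 mm path, 100 mm wide, 1 mm thick shim, 20 C backplate-to-block delta:
print(round(strip_watts(0.030, 0.100, 0.001, 20), 1))  # ~26.7 W
```

A few tens of watts pulled off the backplate into the loop is consistent with the worry above that the block (and GPU) would run warmer for it.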


----------



## acoustic

cennis said:


> I will be receiving a FTW3 shortly, can you let us know your OC settings? Do you also see ~450W in port royal or timespy just so we can have something consistent to compare against.


Yes, I get the full 450 watts in every application, if it's capable of drawing that much. I have no issues with being stuck underneath my power limit.

Here's my Port Royal and Timespy benches:

Port Royal - 12840: I scored 12 840 in Port Royal

TimeSpy - 18650: I scored 18 650 in Time Spy

FireStrike - 33524: I scored 33 524 in Fire Strike

I run +125/+650 for Port Royal, and then for the TimeSpy and FireStrike results, I run at my 24/7 settings of +45/+500 (might have been +550, I don't remember lol)

I have the 240mm EVGA Hybrid cooler on my FTW3.


----------



## Colonel_Klinck

Imprezzion said:


> So, there's no 2 x 8 pin BIOS that will magically make a 2x8pin card pull more then 355w?


That would be a no. Shunt mod is the way those of us with 2x 8 pin have got past the max 375w limit.


----------



## Falkentyne

Imprezzion said:


> So, there's no 2 x 8 pin BIOS that will magically make a 2x8pin card pull more then 355w?
> 
> I mean, my card is a monster imo, it scales 3DMark scores all the way up to +1200 memory and core has no issues boosting well past 2100Mhz and staying stable around 2055-2070Mhz as low as 1.006v but I wanna have it power throttle a bit less, especially if I get my Bykski block for it. Which is going to be a problem because they are out of stock at Bykski themselves, there's no local resellers here, so I have to use eBay / Amazon / AliExpress worldwide.. that's going to get expensive quickly and take weeks..
> 
> Anyone know of a way to get a Bykski Gigabyte Gaming OC 3080/3090 block in Europe?


No, you need to shunt mod the 6 (or 7) large shunts to bypass the main limits.
Then you need to identify what resistance the 1206 shunts are (usually 5 mOhm, but the Strix is 3 mOhm), as they control the MSVDD and NVVDD current rails. Those are harder to shunt. Stacking 5 mOhms on those will remove those limits (well, double them). Strix limits are high enough so you won't have to mod them.


----------



## mouacyk

Colonel_Klinck said:


> That would be a no. Shunt mod is the way those of us with 2x 8 pin have got past the max 375w limit.


There likely won't ever be a BIOS for 2x8pin cards to surpass it either, for (1) warranty and (2) marketing reasons. If you request the 1000W BIOS, EVGA knows who you are.



Falkentyne said:


> No, you need to shunt mod the 6 (or 7) large shunts to bypass the main limits.
> Then you need to identify what resistance the 1206 shunts are (usually 5 mOhm, but the Strix is 3 mOhm), as they control the MSVDD and NVVDD current rails. Those are harder to shunt. Stacking 5 mOhms on those will remove those limits (well, double them). Strix limits are high enough so you won't have to mod them.


On the Gigabyte boards, I think we only have 6 large shunt resistors (I posted a pic recently). Do we need to mod the MSVDD and NVVDD ones, or is that specific to the FE PCBs?


----------



## Glottis

What's the real difference between an overclocked 2x8pin card vs a 3x8pin anyway? From what I gathered reading around, it's about 2-3%; not sure if it's worth turning your PC into an electric heater, especially when summers get hotter each year. My PC with the 2x8pin TUF draws up to 600W, so with a 3x8pin card that would be over 700W. That's just madness IMO. People bragging about their cards using 450W is understandable for benching, but I think it's a waste for just regular everyday gaming.


----------



## chiknnwatrmln

Glottis said:


> What's the real difference between overclocked 2x8pin card vs 3x8pin anyway? What I gathered from reading around it's about 2-3%, not sure if it's worth turning your PC into electric heater, especially when summers get more and more hot each year. My PC with 2x8pin TUF draws up to 600W, so with 3x8pin card that would be over 700W. That's just madness IMO. People bragging about their cards using 450W is understandable for benching, but I think it's a waste for just regular every day gaming.


In my experience (with my 3x8pin card), bumping the power limit from 320w to 420+w gives about a 7-8% performance difference. From 350w to 420+ it's like 3%.
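Putting rough efficiency numbers on that trade-off (illustrative arithmetic using the percentages above):

```python
# Rough perf-per-watt arithmetic for the numbers above: +7.5% performance
# for going from 320 W to 420 W is a sizable efficiency hit.

def perf_per_watt_change(p0_w, p1_w, perf_gain):
    """Fractional change in performance-per-watt between two power limits."""
    return ((1 + perf_gain) / p1_w) / (1 / p0_w) - 1

change = perf_per_watt_change(320, 420, 0.075)
print(f"{change:+.1%}")  # about -18% perf/W
```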

Maybe it's a sign of me getting old, but IMO the extra heat output and stress on the components isn't worth it.

Long gone are the days of flashing a BIOS, cranking voltage, turning up clocks and gaining 20% performance.


----------



## Falkentyne

mouacyk said:


> There likely won't ever be a BIOS for 2x pins to surpass it either, for (1) warranty and (2) marketing reasons. If you request the 1000W BIOS, EVGA knows who you are.
> 
> 
> On the Gigabyte boards, I think we only have 6 large shunt resistors (I posted a pic recently). Do we need to mod the MSVDD and NVVDD ones, or is that specific to the FE PCBs?


All cards have MSVDD and NVVDD shunts. It's part of the reference spec.

These rails are not shown in HWinfo64 because there is no standardized API for reading rails from smart power stages.
TDP Normalized has access to the rails, however. Say that after your shunt mod your TDP % is something like 75%: your 8-pin maximums (example: 90W, 100W), PCIe Slot Power (ex: 50W), and GPU Chip Power (ex: 150W) are all reporting nice and low, MVDDC is reporting nice and low (ex: 45W), and SRC is reporting nice and low (ex: 75W) -- but your TDP Normalized is still something like 120%. That difference comes from the MSVDD/NVVDD rails that you can't see.

I don't know about the 3080 cards, but the 3090 Strix has 3 mOhm shunts on the MSVDD / NVVDD rails, while all the other cards have 5 mOhm shunts there. Those shunts are 1206 package size.


----------



## Muqeshem

acoustic said:


> Yes I get full 450watt in every application if it's capable of drawing that much. I have no issues with being stuck underneath my power limit.
> 
> Here's my Port Royal and Timespy benches:
> 
> Port Royal - 12840: I scored 12 840 in Port Royal
> 
> TimeSpy - 18650: I scored 18 650 in Time Spy
> 
> FireStrike - 33524: I scored 33 524 in Fire Strike
> 
> I run +125/+650 for Port Royal, and then for TimeSpy and FireStrike results, I run at my 24/7 settings of +45/+500(might have been +550, I don't remember lol)
> 
> I have the 240mm EVGA Hybrid cooler on my FTW3.


You are the man I am looking for.
Same setup for the card: FTW3 with the hybrid cooler kit.









I scored 18 671 in Time Spy (Intel Core i9-9900K Processor, NVIDIA GeForce RTX 3080 x 1, 16384 MB, 64-bit Windows 10) -- www.3dmark.com

Scored 20 010 in GPU.

I think I had the core up to +155MHz and memory up to +1100MHz.
450watt bios.

I want to ask you about your memory junction temperature; mine never passes 80 degrees with the hybrid cooler.
The hybrid cooler only cools the GPU core; the other components are cooled almost passively.
The freaking fan is away from the memory and all the hot components -- very bad design.


----------



## Imprezzion

Colonel_Klinck said:


> That would be a no. Shunt mod is the way those of us with 2x 8 pin have got past the max 375w limit.


Oh well, when I order my 470uF caps to fix my broken 2080 Ti, I'll just order some shunt resistors as well. Thanks for the info so far, guys.

I am hitting consistent 18500-18550 in Time Spy even on air without shunts thanks to my memory doing +1200 without error correcting and the core easily running 2000+ even at 0.962v. I did 6 consecutive runs and all scored within 50 points of each other.

+1400 memory sees the score drop hard, so I'm too far there.

Won't be for the first few months tho cause I gotta wait 5-6 weeks for my waterblock from eBay to ship here..


----------



## mouacyk

Falkentyne said:


> All cards have MSVDD and NVVDD shunts. It's part of the reference spec.
> 
> These rails are not shown in HWinfo64 because there is no standardized API for reading rails from smart power stages.
> TDP Normalized has access to the rails however. If your TDP % after your shunt mod is something like 75%, your 8 pins maximums (example: 90W, 100W), PCIE Slot Power (ex: 50W) and GPU Chip Power (ex: 150W) are all reporting nice and low, MVDDC is reporting nice and low (ex: 45W), SRC is reporting nice and low (ex: 75W), but your TDP Normalized is still like 120%, that's from the MSVDD/NVVDD rails that you can't see.
> 
> I don't know about the 3080 cards, but the 3090 Strix has 3 mOhm shunts on the MSVDD / NVVDD rails, while all the other cards have 5 mOhm shunts there. Those shunts are 1206 package size.


Do you think the two shunts I circled in red are the ones you're referring to, or are they even smaller (1206)? @bmgjet Can you confirm? I was planning on stacking 15 mOhm onto these 5 mOhm shunts and the slot shunt on the backside, thinking it should be enough to get to around a 525W power limit.
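For anyone checking the numbers: a stacked shunt sits in parallel with the original, so the controller, which still divides the sense voltage by the original resistance, under-reads the current. A rough sketch of that arithmetic using the values in this post (5 mOhm originals, a 15 mOhm stack, and an assumed 375 W reported cap; the exact cap depends on the BIOS):

```python
# Hedged sketch of stacked-shunt arithmetic. Values are illustrative:
# 5 mOhm original shunts, 15 mOhm stacked on top, 375 W vBIOS cap.

def parallel(r1, r2):
    """Resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

r_orig = 5.0                        # mOhm, what the controller assumes
r_stack = 15.0                      # mOhm, resistor stacked on top
r_eff = parallel(r_orig, r_stack)   # 3.75 mOhm actual sense resistance

# The controller still divides the sense voltage by 5 mOhm, so the
# reported power is scaled down by r_eff / r_orig:
scale = r_eff / r_orig              # 0.75

reported_cap = 375.0                # W, assumed vBIOS power limit
true_draw_at_cap = reported_cap / scale
print(f"effective shunt: {r_eff} mOhm, under-report factor: {scale}")
print(f"true draw when the BIOS reads {reported_cap} W: {true_draw_at_cap} W")  # -> 500.0 W
```

So a 15-on-5 stack alone lands near 500 W of true draw at a 375 W cap; getting to 525 W would also lean on the slot shunt or a higher-limit BIOS.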


----------



## Muqeshem

mouacyk said:


> Do you think the two shunts I circled in red are the ones you're referring to, or are they even smaller (1206)? @bmgjet Can you confirm? I was planning on stacking 15mOhm onto these 5 and the slot shunt on the backside, thinking it should be enough to get around 525W power limit.
> View attachment 2477238



Why shunt when those cards can already pull 450 to 500 watts with a custom vBIOS?
I think it is pointless for day-to-day usage.


----------



## mouacyk

Muqeshem said:


> Why shunt when those cards can already pull 450 to 500 watts with a custom vBIOS?
> I think it is pointless for day-to-day usage.


I will pay you $100 for such a BIOS. And return to Amazon all these unused shunting materials.


----------



## Falkentyne

mouacyk said:


> Do you think the two shunts I circled in red are the ones you're referring to, or are they even smaller (1206)? @bmgjet Can you confirm? I was planning on stacking 15mOhm onto these 5 and the slot shunt on the backside, thinking it should be enough to get around 525W power limit.
> View attachment 2477238


A 2512 shunt is 4 times the size of a 1206 shunt.
Here are 1206 shunts next to 2512 shunts.









Here are the 5 mOhm 1206 shunts on an MSI Suprim.



https://www.techpowerup.com/review/msi-geforce-rtx-3090-suprim-x/images/back_full.jpg


----------



## mouacyk

Falkentyne said:


> a 2512 shunt is 4 times the size of a 1206 shunt.
> Here are 1206 shunts next to 2512 shunts.


Are you saying that without finding and modding the 1206 MSVDD and NVVDD shunts as well, I won't get the full 525W I wanted? I wonder why @bmgjet 's Easy Shunt Mod did not emphasize these smaller shunts...


----------



## Falkentyne

mouacyk said:


> Are you saying that without finding and modding the 1206 MSVDD and NVVDD shunts as well, I won't get the full 525W I wanted? I wonder why @bmgjet 's Easy Shunt Mod did not emphasize these smaller shunts...


Because no one knew about them until the schematics were found.
They only knew "something" was throttling them that was not responding to the 6 large shunts, and TDP Normalized % saw it.


----------



## mouacyk

Falkentyne said:


> Because no one knew about them until the schematics were found.
> They only knew "something" was throttling them that was not responding to the 6 large shunts, and TDP Normalized % saw it.


Well.. that probably explains why almost everyone who attempted it in this thread was still hitting some limit somewhere, and ended up scraping and re-scraping shunts to no effect. Glad I haven't started.

Could I get around it with a BIOS that has a PL over 100%? I think the largest percentage available for a 2x 8-pin 3080 is 117%, on the ASUS TUF. (Is this what you meant in your post How To: Easy Mode Shunt Modding?) And if I understand the math correctly, I would still max out at 370W * 1.17 = 433W?
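Sanity-checking that slider math (the 117% figure is the TUF value quoted above; treat the numbers as illustrative):

```python
# Reported power limit with the slider maxed: base limit x max slider.
base_limit = 370.0   # W, 2x 8-pin BIOS default quoted above
max_slider = 1.17    # 117% maximum power-limit slider (ASUS TUF figure)

max_reported = base_limit * max_slider
print(round(max_reported))  # -> 433, well short of a 525 W target
```

Which is why a higher-PL BIOS alone doesn't reach the target without a shunt mod on top.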


----------



## Falkentyne

mouacyk said:


> Well.. that probably explains why almost everyone who attempted it in this thread was still hitting some limit somewhere, and end up scraping and re-scraping shunts to no effect. Glad I haven't started.
> 
> Could I get around it with a BIOS that has a PL over 100%? I think the largest percentage available for a 2x 8-pin 3080 is 117%, on the ASUS TUF. (Is this what you meant in your post How To: Easy Mode Shunt Modding?)


These shunts have absolutely nothing to do with hitting the 8 pin limits or power balancing issues with badly modded primary shunts.
If you're hitting an 8 pin limit with imbalanced 8 pins, that means something is wrong with the 6 main shunts.

A power limit higher than 100% does give you more power, yes. The MSVDD and NVVDD limits have a 'default' and a 'maximum'. The problem is, the default is based on the 100% power limit, and the maximum is based on the maximum power limit slider point. But the Ampere BIOS editor doesn't show these limits.

The only thing known is that the Strix has 3 mOhm shunts on these rails, which is why when you regular shunt mod a Strix, you don't get power limit throttling in Timespy Extreme.


----------



## Muqeshem

mouacyk said:


> I will pay you $100 for such a BIOS. And return to Amazon all these unused shunting materials.


EVGA has a 450 W BIOS on their forum. Wait, I am confused; you might need to brief me a bit about why you want to shunt, please.


----------



## mouacyk

Muqeshem said:


> EVGA has a 450 W BIOS on their forum. Wait, I am confused; you might need to brief me a bit about why you want to shunt, please.


2x 8-pin cards are power limited to 375W by any available vBIOS right now. If @Falkentyne 's information is any indication, there's still a whole lot of voodoo going on that throttles cards at around 350W, and at other unexpected levels even after successfully shunting.


----------



## Nizzen

nyk20z3 said:


> If any one has a 3080 Strix OC for sale at MSRP let me know, cash in hand.


LOL


----------



## mouacyk

According to Frame Chasers, he and TechLab didn't need to shunt the small 1206 resistors and they got 3080s to pull 500W.


----------



## Falkentyne

mouacyk said:


> According to Frame Chasers, he and TechLab didn't need to shunt the 1206 small resistors and they got 3080's to pull 500W.


Framechasers is the village clown. Never watch or listen to anything he says. He's also a power-driven egomaniac.
The only thing he was ever right about was the PCIe slot power issue on FTW3s.
And not all cards are going to have the same limits. Since those limits cannot be seen in any editor, we don't know what they are. But we have seen people with perfectly solder-modded 3080s in the shunt mod thread hit a hard wall at 380W even when crossflashing another BIOS. That person experimented with different resistances, and even though TDP went way down, the card still throttled at a percentage of the TDP, ending up at 380W true draw.

If you can identify the 1206 shunts on your card, these are the ones you need for stacking.


https://www.mouser.com/ProductDetail/71-WSLP12065L000FEA/


If your card has three 1206 shunts, you need to take a multimeter and identify the two that are connected to the larger 2512 shunts.
Sometimes one of the three shunts is connected to some fan or RGB controller, and I have no idea what would happen if you modded that one, whether all three must be modded, or whether that one must not be modded.

Note two things here.
1) it is still NOT confirmed that shunting these will remove the limits that matter, because it is still NOT confirmed that those are the limits affecting the cards! It is only confirmed that those shunts deal with MSVDD and NVVDD limits, but we don't know what the limits are since we can't see them. That's all. It's entirely conceivable that those rails draw so little power they never reach their limits. But until someone with proper testing equipment can debug that, well, we don't know yet.

2) It is only confirmed that the Strix has 3 mOhm shunts for those rails, while all the other cards have 5 mOhm. And it is confirmed that shunt modding the main shunts on the Strix allows it to not power throttle in Timespy Extreme. This can be from either higher BIOS limits or from the 3 mOhm shunts. Remember the Strix does have a 648W GPU Chip Power limit in its bios.

I'm not soldering those super tiny shunts to my FE right now. I don't feel like taking apart my card for what really isn't an essential mod on air cooling, especially with a painful disability.


----------



## Imprezzion

Would it be better for overall performance to curve-limit the voltage and set the clocks as high as they'll go at that given voltage, minimizing or even eliminating power throttling altogether? Or should I just let the card decide, run a basic core clock offset, and let it regulate the clocks and voltage on its own while sitting at the power limit the entire time?

Right now I run +105 core and +1200 memory (yes, it's stable and doesn't hit error correction), but clocks and voltage obviously go all over the place while gaming, from as high as 2145MHz @ 1.100v in light loads to as low as 1980MHz @ 0.962v under super heavy loads like 3DMark. In Division 2 it generally runs anywhere from 2025MHz @ 1.000v to 2085MHz @ 1.062v. Would it, for example, perform better overall if I limit it to, say, 1.012v and see how high of a clock speed it would do there?

It's stable at +105, but +120 sees it going to 2160MHz in menus and lighter loads, which it can't sustain without crashing if the voltage drops below 1.100v before the clocks drop fast enough. It runs +120 stable under load as long as the load doesn't suddenly drop and cause the clocks to overshoot.

Judging from what I saw running World of Tanks with uncapped FPS well over 300, the card has no issues running well above 2000MHz even under 1v, at 0.987v or whatever that bin is.


----------



## EarlZ

I have an Aorus Master 3080 Rev 2 and I can see the Tjunction for VRAM goes to 100C. How are you guys even getting +1200MHz stable on a memory OC with air cooling?


----------



## Glottis

ssgwright said:


> it is, saw big improvements in oc with the new bios


I updated my TUF bios and noticed no difference whatsoever. ¯\_(ツ)_/¯


----------



## Imprezzion

EarlZ said:


> I have an Aorus Master 3080 Rev2 and I can see the Tjunction for VRAM goes to 100c, how are you guys even getting +1200Mhz stable on mem oc with air cooling ?


Extra thermal pads between card and backplate, a lot of luck and 86c junction still? Hehe.

It did time spy with a graphics score of 18569 on +105 / +1200 as a comparison.

EDIT: Man, Cyberpunk 2077 is really annoying.. The 2080 Ti had the same problem: stable at a certain clock in every other game but Cyberpunk, and this 3080 does the same.
Obviously it's harder to see which clock is unstable as both clocks and voltage are jumping around wildly due to power, but +105 and +90 core are a no-go; it will crash in minutes in CP2077 while other games run it forever. Usually it crashes when the voltage drops to 0.987v or lower while the core clock is still above 2010MHz.. Running +60 core eliminates this problem so far (tested for like 10 minutes but k) but it also won't boost nearly as high.

Weird thing is, when I used the curve to clock it to 2025MHz @ 1.025v it instantly crashed, but it will run that just fine under automatic boost. Weird.


----------



## EarlZ

Imprezzion said:


> Extra thermal pads between card and backplate, a lot of luck and 86c junction still? Hehe.
> 
> It did time spy with a graphics score of 18569 on +105 / +1200 as a comparison.
> 
> EDIT: Man Cyberpunk 2077 is really annoying.. The 2080 Ti had the same problem, stable on a certain clock in any other game but Cyberpunk, this 3080 does the same.
> Obviously it's harder to see which clock is unstable as both clocks and voltage is jumping around wildly due to power but +105 and +90 core are a no-go it will crash in minutes in CP2077 while other games run it forever. Usually when the voltage drops to 0.987v or lower and core clock is still above 2010Mhz it will crash.. Running +60 core eliminates this problem so far (tested for like 10 minutes but k) but it also won't boost nearly as high.
> 
> Weird thing is, when i used the Curve to clock it to 2025Mhz @ 1.025v it instantly crashed but it will run that just fine under automatic boost. Weird.


Which brand of thermal pads are you using? My backplate is already so hot that I can't even touch it for more than 2 seconds.


----------



## Imprezzion

The thick blue Arctic ones that came with my old Arctic Accelero Xtreme IV backplate. Perfect thickness. I might even put that entire backplate on the card.. I should have it somewhere.


----------



## Hirtle

@Falkentyne I remember you mentioned something about being unsure if the Strix 3080 has the 1206 shunts. Mine does not.


----------



## ssgwright

Glottis said:


> I updated my TUF bios and noticed no difference whatsoever. ¯\_(ツ)_/¯


Maybe I did because I'm shunted? IDK


----------



## MrKenzie

acoustic said:


> Has anyone figured out what's causing some cards to be 30-40w under the actual power limit? My FTW3 Ultra will hit 450w all day long, and in games like Metro Exodus with large transient spikes, I've seen upwards of 475-480watts while the card adjusts voltages to get back under the limit.


My 3080 would hit 450W with the stock cooler, now that I have it water cooled it almost never goes above 435W. I think it's more efficient due to lower temps, and also no fan power draw.


----------



## Muqeshem

Muqeshem said:


> you are the man I am looking for.
> Same setup for the card: FTW3 with the hybrid cooler kit.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 18 671 in Time Spy
> 
> 
> Intel Core i9-9900K Processor, NVIDIA GeForce RTX 3080 x 1, 16384 MB, 64-bit Windows 10}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> Scored 20 010 in gpu.
> 
> I think I had core up to 155mhz and memory up to 1100mhz.
> 450watt bios.
> 
> I want to ask about your memory junction temperature; mine never passes 80 degrees with the hybrid cooler.
> The hybrid cooler only cools the GPU core and cools the other components almost passively.
> The freaking fan is away from the memory and all the hot components, very bad design.


Please reply, acoustic. I messaged several times in DM.


----------



## DrWaffles

acoustic said:


> Has anyone figured out what's causing some cards to be 30-40w under the actual power limit? My FTW3 Ultra will hit 450w all day long, and in games like Metro Exodus with large transient spikes, I've seen upwards of 475-480watts while the card adjusts voltages to get back under the limit.


Mine does this, it always starts throttling voltage around 410w.

Also, where the hell is everybody getting the Ampere BIOS editor from? I wouldn't mind comparing some of the ROMs to find the best one for my FTW3 Ultra.. The 450w BIOS gives me the ****s.


----------



## Imprezzion

Same here, it depends on the game but my 370w BIOS usually throttles around 345-355w. I've only once seen it do the full 370w and that was in 3DMark Time Spy. Not even Superposition does the full 370w.

It usually limits at the PCI-E power judging from HWINFO64. So maybe the BIOS doesn't allow more than 60w as mine never goes over 60w for PCI-E. It does hit 150w for both the 8 pins from time to time.


----------



## acoustic

MrKenzie said:


> My 3080 would hit 450W with the stock cooler, now that I have it water cooled it almost never goes above 435W. I think it's more efficient due to lower temps, and also no fan power draw.


Yup. I have my Hybrid cooler fans plugged directly into the motherboard, and with the MSI BIOS, have them scaling with the Mosfet temps; after some extended gaming they'll spin up from 1050rpm to 1250rpm. Works great and silent.



Muqeshem said:


> Please replay acoustic. I messaged serveral times in DM.


Sorry, I was busy and I'm not super active here.

Running Timespy with my normal 24/7 settings: 450watt BIOS, +45core/+500mem, GPU rad fans @ 1050rpm, and the Hybrid fan on the PCB on 1:1 curve (never got above 55% in this run) .. GPU Memory Junction temp maxed @ 78c. I think people are getting way too caught up with this junction temp crap. EVGA cards have ICX and you can see the memory temps, those are more important imo. Memory junction maxed @ 78c, but only MEM1 exceeded 60c.

I also have bum memory that won't go above +650. I have lower temps with the Hybrid cooler on my memory than I did with the stock cooler.



DrWaffles said:


> Mine does this, it always starts throttling voltage around 410w.


Strange. I mean, it makes sense for the card to start throttling slightly before the power limit in order to stop transient spikes causing the wattage to completely blow the power limit, but it does seem like some cards don't care as much. Like I said, my card will hit 475-480watts sometimes with those transient loads, and playing Metro Exodus, I'll see 440+watts steady pretty much the entire time playing. It's pretty nuts.


----------



## DrWaffles

Imprezzion said:


> Same here, it depends on the game but my 370w BIOS usually throttles around 345-355w. I've only once seen it do the full 370w and that was in 3DMark Time Spy. Not even Superposition does the full 370w.
> 
> It usually limits at the PCI-E power judging from HWINFO64. So maybe the BIOS doesn't allow more than 60w as mine never goes over 60w for PCI-E. It does hit 150w for both the 8 pins from time to time.


I don't think it's a slot limit, as mine used to hover around 45w.. There was a strange load-balancing issue that might be the cause: the first and/or second plug would draw significantly more than the third plug.
Card's on water too, so it's not temp related.
I've seen it in the 440s before, but it won't do that at full voltage and clocks; it's usually already down at 2000MHz instead of the 2100-2130 I can get at 1100mv.


----------



## DrWaffles

Took the global 9900KF/3080 scores last night, boys!
Just regular water cooling with a quick and dirty 5.2GHz 1.35v OC.
Memory took a bit of time to tune as the mobo configured it terribly, even though it was on the QVL.

I'm pretty chuffed!


----------



## Hirtle

After pulling my card out again and re-doing all the shunts, I ran (a custom run of) Port Royal and observed the reported power draw to be the expected values for a properly shunt-modded card (see Falkentyne's post on page 192). I let it cool off for a few minutes and went for another run at Port Royal. I got a score similar to my personal record (13007 vs. 130047). I checked GPU-Z again, only to discover that the card was back at its original power limits. I think the paint I'm using is shrinking as it dries and causing the connection between the two shunts to fail. It's MG842AR, which was so highly recommended in the "How to: Easy mode shunt modding" thread. I'm going to just solder it at some point in the future. Using this paint to attach the shunts is not working for me.


----------



## Falkentyne

Hirtle said:


> After pulling my card out again and re-doing all the shunts, I ran (a custom run of) Port Royal and observed the reported power draw to be the expected values for a properly shunt modded card (see Falkantyne's post on page 192). I let it cool off for a few minutes and went for another run at Port Royal. I got a score similar to my personal record (13007 vs. 130047). I checked GPUZ again, only to discover that the card was back at it's original power limits. I think the paint I'm using is shrinking as it dries and causing the connection between the two shunts to fail. It's MG842AR, which was so highly recommended in the "How to: Easy mode shunt modding" thread. I'm going to just solder it at some point in the future. Using this paint to attach the shunts is not working for me.


Ideally you're supposed to bake the paint application at about 60-75C for 30 minutes, then apply a second layer, and then repeat. That way the paint cures properly. Of course something frightens me about putting an expensive video card in an oven, but you know...

According to the MG docs, air curing requires 24 hours of settling.

I have no idea about the effect of loading a card with current through freshly painted shunts; that's beyond me.
Sky3900, in the main shunt mod thread, baked his card, I think at 175F for 30 minutes, before each paint layer. He didn't experience a problem like this.


----------



## Falkentyne

Hirtle said:


> After pulling my card out again and re-doing all the shunts, I ran (a custom run of) Port Royal and observed the reported power draw to be the expected values for a properly shunt modded card (see Falkantyne's post on page 192). I let it cool off for a few minutes and went for another run at Port Royal. I got a score similar to my personal record (13007 vs. 130047). I checked GPUZ again, only to discover that the card was back at it's original power limits. I think the paint I'm using is shrinking as it dries and causing the connection between the two shunts to fail. It's MG842AR, which was so highly recommended in the "How to: Easy mode shunt modding" thread. I'm going to just solder it at some point in the future. Using this paint to attach the shunts is not working for me.


The great Elmor just told me he doesn't recommend paint long term. Only for testing.
Main issue is all the heat and current that can go through the 8 pins (and chip power).
Soldering is difficult at first but once you get a good iron, good Kapton 3M tape (this is ESSENTIAL!!!!) and good flux, and are not afraid to heat the edges of the shunt, it's MUCH easier in the end!


----------



## ducegt

Anyone with a Suprim (or flashed Suprim BIOS) able to get the voltage over 1075mV?


----------



## ssgwright

Without shunt modding my card I wasn't able to get above 1.075v, and that's with an ASUS TUF.


----------



## EarlZ

Has anyone here with a Gigabyte Aorus Master Rev 2 decided to open up the card and replace the thermal pads?

I've seen some posts on Reddit saying the Rev 1 cards do not have a thermal pad on the GPU backplate and the ones on the VRAM show a lot of 'oily' residue. I am not sure if it's the same for Rev 2, but I can see there is a thermal pad on the backplate, and it becomes so hot to the touch that I can't even put my finger on it for more than 1 second.


----------



## Imprezzion

I have a Gigabyte Gaming OC which didn't have thermal pads to the backplate, but I added them myself. Helps a bit, 5-8C cooler on the VRAM, but the backplate is kinda small, so I might replace it with the Arctic Accelero Xtreme IV one for a bit better cooling. You can order those for less than €20.

Also, on voltage: I can get 1.100v on the Gaming OC just fine as long as I don't hit the power limit.


----------



## EarlZ

Imprezzion said:


> I have a Gigabyte Gaming OC which didn't have thermal pads to the backplate but I added them myself. Helps a bit, 5-8c cooler on the VRAM but the backplate is kinda small so I might replace the backplate with the Arctic Accelero IV Extreme one for a bit better cooling. You can order those for less than €20 so.
> 
> Also, voltage, I can get 1.100v on the Gaming OC just fine as long as I don't hit power limit.


I am using a Noctua D15, so there is very limited space for the backplate. I want to change the thermal pads on mine, but I am not 100% sure if it's 2mm for the VRAM side and 3mm for its back side, or how many 80x40mm pads I would need.


----------



## Imprezzion

For my Gaming OC the backplate required 3mm pads. The Arctic ones are 3.5mm, but the mounting screws pull them flat nicely.

One issue tho.. if I want to put on the Arctic backplate then I obviously have to remove the stock one, which means the PCI-E 8-pin adapters aren't supported anymore, as they are mounted to the backplate.

I'd have to figure something out for that.. 

Or be a total moron and chop up the stock backplate. I wouldn't put it past myself to do that lol..

I got some stuff underway, extra thermal pads, the backplate, a new tube of Kryonaut as I ran out of PK3 which you can't buy anymore, 5 boxes of Coollaboratory Liquid Ultra (still prefer it over Conductonaut by a country mile) and a 10900K Delid kit from Rockitcool. 

And of course a bunch of caps and resistors from Panasonic so I can shunt it and fix my broken 2080 Ti in the process lol.


----------



## Hirtle

Falkentyne said:


> The great Elmor just told me he doesn't recommend paint long term. Only for testing.
> Main issue is all the heat and current that can go through the 8 pins (and chip power).
> Soldering is difficult at first but once you get a good iron, good Kapton 3M tape (this is ESSENTIAL!!!!) and good flux, and are not afraid to heat the edges of the shunt, it's MUCH easier in the end!


Thanks Falkentyne. I'm certainly not afraid to solder the card, I've done it before. I just thought it would be nice to have a mod that could so easily be reversible. 

About Elmor: I've been wanting to get an EVC2 but I'm not quite sure how to hook it up. I'm guessing it connects to the three headers PCON1, PCON2, and SCON1 (which I'm also assuming are for Vcore, MSVDD, and memory, respectively). It seems that the new EVC2SX is capable of controlling all three at once? And how do I go about connecting all three to the EVC2?


----------



## DaftConspiracy

Colonel_Klinck said:


> Ok I did the 2x 8pin shunts again. Still not perfect but its now a 7 to 8w difference instead of 20w. Can't be bothered to drain the loop and strip it again today but will have another attempt maybe at the weekend. Next time I think I'll scape off all the old paint and start again. Maybe the difference is thickness of paint as I've just applied more on top.


Try something more conductive like anti-seize. It'll be easier to remove down the road too.

Sent from my IN2025 using Tapatalk


----------



## DaftConspiracy

Hirtle said:


> I'm still having issues shunt modding my card. I pulled all the added shunts off, scraped the stock ones and the new ones, then reapplied with MG842AR. It looks like the mod worked for MVDDC and PWR_SRC, but not for the 8 pins. I think I'll abandon the idea of applying them with the MG842AR and solder them instead. Here's a comparison while running Furmark. The left side is after my attempt to shunt mod it and the right side is stock.
> 
> View attachment 2476993


270w on the core is quite a bit. Furmark will always hit power limit regardless of how high it is. Try Port Royal, that puts a pretty typical power draw on the card.

Sent from my IN2025 using Tapatalk


----------



## Hirtle

DaftConspiracy said:


> 270w on the core is quite a bit. Furmark will always hit power limit regardless of how high it is. Try Port Royal, that puts a pretty typical power draw on the card.
> 
> Sent from my IN2025 using Tapatalk


The point of that post was that the reported power draw wasn't reading correctly; if the shunt mod had been successful, it would have been lower. 270W on the core is typical for a 3080 running Furmark. If you see my latest post on this, I ran Port Royal after re-doing the shunt mod and it responded as expected for a minute before, presumably, the paint failed. The only reason I use Furmark is because, in my experience, it produces the most consistent power draw of any benchmark I've tried.


----------



## Colonel_Klinck

DaftConspiracy said:


> Try something more conductive like anti-seize. It'll be easier to remove down the road too.
> 
> Sent from my IN2025 using Tapatalk


What like copper grease you mean?


----------



## Falkentyne

Colonel_Klinck said:


> What like copper grease you mean?


Just solder.
It's a LOT safer in the end than paint. (even though it may not seem like it!)
What you need to do is to do what too many people are NOT doing--buy some high quality Kapton tape. 3M is the best.

3M high-temp polyimide (Kapton) tape. You can buy 1/4", 1/2" or 3/4" wide reels, but 1/4" is probably best because there's less work involved; you can get the tape into tiny places or wrap it on the side of MLCC caps more easily when you have a smaller area to cover. Simply tape completely around the shunt and any areas within a few inches. Doing this will protect your PCB from shorts.
It's worth buying the 3M stuff rather than generic. The adhesion is miles better on the 3M tape. But always clean the area you are taping with 100% (or 91%) alcohol first; it makes for a better grip.

Then you need a high quality soldering iron. A TS100 is the best choice for new users who want a temperature regulated iron (it reads the temp from the tip that is installed), with a 24V power supply.
Having a nice 65W iron makes soldering MUCH easier than a starter kit with a 25W iron that isn't temperature regulated. If you're finding soldering too hard and you have a starter kit, well--that's probably why!






UY CHAN Upgraded Original TS100 Digital OLED Programmable Pocket-size Smart Mini Outdoor Portable Soldering Iron Station Kit Embedded Interface DC5525 Acceleration Sensors STM32 Chip Fast Heat (B2) - - Amazon.com




www.amazon.com





Then you need some high quality rosin flux. Flux is what directs the solder and is essential. You apply flux between each and every soldering pass. You can even desolder a stacked shunt (e.g. to return to stock) this way: just flux the shunt, then apply the iron (without solder on the tip) to the side of the shunt and the solder will migrate to the iron; dip the tip into the cleaning/tinning bronze coils, apply more flux, and repeat until enough solder is removed that you can move the stacked shunt with tweezers. (Note: desoldering an original shunt is HARD and very advanced work; sometimes it requires TWO irons, one applied to each side of the shunt to melt the original solder pads evenly. Don't bother with that for now, so all you want to do is just stack.)

Then good solder wire, like Kester 60/40 solder. This is good stuff.

The key is, and this is what scares new people at the end: apply flux to the edges of the original shunt,
then apply a small amount of solder from your Kester wire to the tip of the iron, and then (what scares people)
touch the iron to the edge of the shunt (350C is a good temp) and heat it. The solder won't flow right away. If you tried putting solder on an open shunt on a work bench or countertop, you would see the solder flow instantly. It doesn't do that on PCBs because the PCB absorbs much of the heat! The heat goes through the shunt into the PCB and gets dissipated! So you need to heat the edges of the shunt enough that the solder will flow and stick to the edge. Once you heat it enough, you will see a small amount of solder start to stick. Then you just move the iron with small movements around the edge of the shunt, maybe rotating the tip a bit to get more solder on, until you have a nice small (not too large!!) elongated solder joint at the edge of your shunt.

If you get too much on and you made a small mountain, just flux (remember, always flux), tin the iron then apply it to the top of the solder and part of it will go back to the iron and you can slowly mold it back into shape. 

Then do the same for the other edge of the shunt. Once you have a small amount of solder on the two edges in a nice, round small ridge, then you flux (AGAIN flux), take the shunt you are stacking, hold it lightly with tweezers on top of the two solder joints you created, tin the iron, apply solder to the iron tip again, then wipe the iron around the very edges of both shunts and the top (do one side at a time of course), until the solder melts and forms a solid seal on that side. Then you switch to the second side (which will be easier once you have one side done), melt the solder and bond it while pressing down lightly on the shunt with your tweezers, and bam: you're done.

Once you get used to it, you will find it easier to work with than paint, because it's FAR less messy. The solder goes where the flux is (from the tip to the flux). Flux is magic.
Clean the work with isopropyl after you're done.

And careful application of 3M tape will stop shorts around your work.

Buying good equipment makes soldering much easier and safer than having bad equipment.


----------



## DaftConspiracy

Colonel_Klinck said:


> What like copper grease you mean?


Yes, copper or nickel anti-seize meant for bolts. Should come off easily enough with some rubbing alcohol and it's a paste so you don't have to worry about it moving as long as you don't glob on excess. It should have a much higher conductivity than any paint you can find. 





Falkentyne said:


> Just solder.
> It's a LOT safer in the end than paint. (even though it may not seem like it!)
> What you need to do is to do what too many people are NOT doing--buy some high quality Kapton tape. 3M is the best.
> 
> 3M High Temp Polyimide tape. You can buy 1/4", 1/2" or 3/4" wide reels, but 1/4" is probably best because there's less work involved--you can get the tape in between tiny places or wrap it on the side of MLCC caps easier when you have a smaller place to cover. And you simply tape completely around the shunt and any areas within a few inches. Doing this will protect your PCB from shorts.
> It's worth it buying the 3M stuff rather than generic stuff. The adhesion is miles better on the 3M stuff. But always clean the area you are taping around with 100% (or 91%) alcohol first--makes a better grip.
> 
> Then you need a high quality soldering iron. A TS100 is the best choice for new users who want a temperature regulated iron (it reads the temp from the tip that is installed), with a 24V power supply.
> Having a nice 65W iron makes soldering MUCH easier than a starter kit with a 25W iron that isn't temperature regulated. If you're finding soldering too hard and you have a starter kit, well--that's probably why!
> 
> 
> 
> 
> 
> 
> UY CHAN Upgraded Original TS100 Digital OLED Programmable Pocket-size Smart Mini Outdoor Portable Soldering Iron Station Kit Embedded Interface DC5525 Acceleration Sensors STM32 Chip Fast Heat (B2) - - Amazon.com
> 
> www.amazon.com
> Then you need some high quality rosin flux. Flux is what helps direct the solder and is essential. You apply flux between each and every soldering pass. You can even desolder a stacked shunt (e.g. to return to stock) this way, by fluxing the shunt, then applying the iron (without solder on the tip) to the side of the shunt; the solder will migrate to the iron. Then dip the tip into the cleaning/tinning bronze coils, apply more flux, and repeat until enough solder is removed that you can move the stacked shunt with tweezers. (Note: desoldering an original shunt is HARD and very advanced work--sometimes it requires TWO irons, one applied to each side of the shunt to melt the original solder pads evenly--don't bother with this now; all you want to do is just stack.)
> 
> Then good solder wire, like Kester 60/40 solder. This is good stuff.
> 
> The key is, and this is what scares new people at the end--apply flux to the edges of the original shunt edges,
> Then apply a small amount of solder from your kester wire to the tip of the iron, and then---what scares people--
> Touch the iron to the edge of the shunt (350C is a good temp), and heat it. The solder won't flow right away. If you tried putting solder on an open shunt on a work bench or countertop, you will see the solder flows instantly. It doesn't do that on PCB's because the PCB absorbs much of the heat ! The heat goes through the shunt into the PCB and gets dissipated! So you need to heat the edges of the shunt enough so that the solder will flow and stick to the edge. Once you heat it enough, you will see a small amount of solder start to stick. Then you just move the iron with small movements around the edge of the shunt, maybe rotating the tip a bit to get more solder on, until you have a nice small (not too large!!) elongated solder joint at the edge of your shunt.
> 
> If you get too much on and you made a small mountain, just flux (remember, always flux), tin the iron then apply it to the top of the solder and part of it will go back to the iron and you can slowly mold it back into shape.
> 
> Then do the same for the other edge of the shunt. Once you have a small amount of solder on the two edges in a nice, round small ridge, then you flux (AGAIN flux), take the shunt you are stacking, hold it lightly with tweezers on top of the two solder joints you created, tin the iron, apply solder to the iron tip again, then wipe the iron around the very edges of both shunts and the top (do one side at a time of course), until the solder melts and forms a solid seal on that side. Then you switch to the second side (which will be easier once you have one side done), melt the solder and bond it while pressing down lightly on the shunt with your tweezers, and bam: you're done.
> 
> Once you get used to it, you will find it easier to work with than paint, because it's FAR less messy. The solder goes where the flux is (from the tip to the flux). Flux is magic.
> Clean the work with isopropyl after you're done.
> 
> And careful application of 3M tape will stop shorts around your work.
> 
> Buying good equipment makes soldering much easier and safer than having bad equipment.


I don't understand the point of the tape, if you're soldering there's no risk of shorts since the solder is fixed. Also I recommend using desoldering wick instead of relying on the iron to remove it. The flux itself doesn't direct solder flow at all, it's just surface prep (though it is very important that it's used).

Sent from my IN2025 using Tapatalk


----------



## Falkentyne

DaftConspiracy said:


> Yes, copper or nickel anti-seize meant for bolts. Should come off easily enough with some rubbing alcohol and it's a paste so you don't have to worry about it moving as long as you don't glob on excess. It should have a much higher conductivity than any paint you can find.
> 
> 
> 
> 
> 
> I don't understand the point of the tape, if you're soldering there's no risk of shorts since the solder is fixed. Also I recommend using desoldering wick instead of relying on the iron to remove it. The flux itself doesn't direct solder flow at all, it's just surface prep (though it is very important that it's used).
> 
> Sent from my IN2025 using Tapatalk


We're talking about stacking shunts, not about removing original shunts. Removing original shunts is an extremely difficult job. Stacking shunts is easy.
Desoldering wick is a good idea for accidents.

ALWAYS tape. ALWAYS. Some boards have less than 1mm to work with between shunts and tiny components!

You NEVER know when the solder might drip someplace it shouldn't get to.
You NEVER know if the work will shift to the side and end up bridging another component.
MULTIPLE users here have suffered soldering bridge nightmares. Sky3900 had to spend an hour removing a bridge when he was attempting to desolder original shunts in the Shunt Mod thread. His card survived.
Dante`afk shorted and killed a board by bridging components south of the PCIE Shunt resistor.
I was saved several times myself just from having Kapton tape on the board.
If you have money for a 3080 or 3090, you have money for Kapton tape. There is NO excuse. It's just smart.

Thank you for the advice on desoldering wick. Just ordered some.


----------



## Colonel_Klinck

DaftConspiracy said:


> Yes, copper or nickel anti-seize meant for bolts. Should come off easily enough with some rubbing alcohol and it's a paste so you don't have to worry about it moving as long as you don't glob on excess. It should have a much higher conductivity than any paint you can find.


I never even considered that but it does have a very high copper content. I have a tub of it in my garage. I might try it if I take the shunts off my card and reattach them or on the 3080ti when it finally starts shipping.


----------



## DaftConspiracy

Falkentyne said:


> We're talking about stacking shunts, not about removing original shunts. Removing original shunts is an extremely difficult job. Stacking shunts is easy.
> Desoldering wick is a good idea for accidents.
> 
> ALWAYS tape. ALWAYS. Some boards have less than 1mm to work with between shunts and tiny components!
> 
> You NEVER know when the solder might drip someplace it shouldn't get to.
> You NEVER know if the work will shift to the side and end up bridging another component.
> MULTIPLE users here have suffered soldering bridge nightmares. Sky3900 had to spend an hour removing a bridge when he was attempting to desolder original shunts in the Shunt Mod thread. His card survived.
> Dante`afk shorted and killed a board by bridging components south of the PCIE Shunt resistor.
> I was saved several times myself just from having Kapton tape on the board.
> If you have money for a 3080 or 3090, you have money for Kapton tape. There is NO excuse. It's just smart.
> 
> Thank you for the advice on desoldering wick. Just ordered some.


For beginners I could definitely see the tape being useful, but as someone with 15 years of soldering experience it wouldn't serve much purpose for me. As long as the tip is the right size and the solder is applied correctly, there shouldn't be any problem with excess solder. As long as you tin the contacts on both resistors beforehand, you can use tweezers to hold the new resistor in place with one hand while you use the iron with the other. Once the first contact is tacked on you won't need to hold the resistor anymore, and hardly any additional solder will be required after tacking them on. I haven't seen any cards that have components in the immediate vicinity of the current sense resistors, but I suppose they could be out there.

Sent from my IN2025 using Tapatalk


----------



## wkdsean88

5600X & 3080 Gaming X Trio
I scored 16 440 in Time Spy (GPU score 19 360 / CPU score 8 865)

Suprim X Bios


----------



## EarlZ

No GPU overclock on my 3080; getting 11,700 on 1080p Extreme in Superposition, and 8K Optimized is 6,500. Does that look normal?


----------



## Hirtle

DaftConspiracy said:


> I haven't seen any cards that have components in the immediate vicinity of the current sense resistors, but I suppose they could be out there.
> 
> Sent from my IN2025 using Tapatalk


Have you not seen an FE card?


----------



## DaftConspiracy

Hirtle said:


> Have you not seen an FE card?


Nope

Sent from my IN2025 using Tapatalk


----------



## Imprezzion

Is there a way to fine-tune the curve? I can get super high overclocks on the core offset-wise (+150 currently), which is stable at 1.093-1.100v, but whenever I play a game that throttles to, for example, 0.987-1.006v, the same +150 offset is not stable at that specific throttled voltage.

It's very annoying to play Cyberpunk with DLSS Quality at like 2145Mhz, as it barely throttles if at all at 1080p all max, RT Psycho, DLSS Quality, but then start up Division 2 and crash because the card cannot do the 2040Mhz at 0.987v that the +150 offset expects.


----------



## blurp

Imprezzion said:


> Is there a way to fine-tune the curve? I can get super high overclocks on the core offset wise (+150 currently) which is stable at 1.093-1.100v but whenever I play a game that throttles to for example 0.987-1.006v the same offset of +150 is not stable at that specific throttled voltage. .
> 
> It's very annoying to play Cyberpunk with DLSS Quality at like 2145Mhz as it barely throttles if at all at 1080p all Max, RT Psycho, DLSS Quality but then start up Division 2 and crash because it cannot do the 2040Mhz at 0.987v that the card expects it to do at +150 offset.


Read this:





Rtx 3000 series undervolt discussion (hardforum.com)

Let's talk undervolting. Creating a custom voltage curve should both reduce power usage and possibly increase performance (if it reduces temps enough to get you into higher boost bins). Here is the guide I have created, thanks to everyone who gave feedback and helped get this tested and revised...


----------



## leegoocrap

so we're off the paint train? 
Just when I got some nice even coats


----------



## mouacyk

leegoocrap said:


> so we're off the paint train?
> Just when I got some nice even coats


Be nice to get multiple samples, not just one.


----------



## Mr Ripper

What's the best way of comparing 2 same make / model cards for best performance? I was thinking of doing the following:

Run the Ampere memory tester on both and see which runs at a higher speed without errors.
Test highest clock?
Test set clock speeds and see which takes lower voltage?

I want to water cool the better one and sell the other. Any suggestions appreciated - Thanks.


----------



## BluePaint

Use AB to fix voltage to something like 1050mv and put fans to 100%. Run the Heaven benchmark in a window and pause it in the same place. Then do 2 test runs (one for VRAM and one for core), adjusting clocks until fps go down (VRAM) or the driver crashes (core).
I don't have experience with the memory tester, so I can't comment on that. Could be useful. Maybe you can test with both methods for VRAM and compare results.


----------



## Imprezzion

Same as I did. I just paused Superposition at a static spot and cranked the VRAM and watched FPS rise until it stopped and eventually went down. Then I ran 3DMark Time Spy on loads of different VRAM clocks and those tests matched the Superposition tests score wise and clock wise. 

So yeah, +1200 on my VRAM is the max I can do with it still scaling. +1250 or +1300 same scores within margin of error and +1350 and up drops hard.

Core will do 2145Mhz as long as it gets full voltage, but I usually let it run its automatic boost with a +75 core clock offset, which is about 2085Mhz, as with the 370w power limit (355w effective) it never gets the full 1.100v and now sits around 1.043-1.056v at 2055-2085Mhz in most games.


----------



## Mr Ripper

On my MSI Gaming X Trio with the Suprim bios:
I think around 2160mhz is the limit for my current card, in Heaven at least. Extra voltage over 1.05v doesn't really help and actually causes pausing and crashing in Heaven, so it might be something other than the core overheating (core is under 60°c).

+900 vram is my max without errors on the memory tester. I thought I'd try +1100 and Time Spy crashes. Max graphics score I've managed is 19k. Lower than +900 vram doesn't improve my score.

Let's see how the other compares tomorrow.


----------



## VPII

Mr Ripper said:


> On my MSI Gaming X Trio with Suprim bios:
> I think around the 2160mhz is my limit for my current card in Heaven at least. Extra voltage over 1.05v doesn't really help and actually causes pausing and crashing in Heaven so might be something other than the core overheating (core is under 60°c).
> 
> +900 vram is my max without errors on the memory tester. I thought I'd try 1100 and Time Spy crashes. Max graphics score I've managed is 19k. Lower than +900 vram doesn't improve my score.
> 
> Lets see how the other compares tomorrow.


Interesting....

With my MSI Gaming X Trio using the stock bios I can push the core up to +165mhz, which in effect is 2160mhz core, but when I use the MSI Suprim X bios I can only run +105mhz, which in effect gives 2100 core. Score in Time Spy is a fair bit higher though, but memory is running +1200mhz.


----------



## Imprezzion

Max I can pull graphics score wise with my 2x8pin Gigabyte Gaming OC is about 18500-18600 due to power throttling. Average clock is only 2040-2055Mhz. Memory +1200. I might offer it up for trade locally for a random 3x8 pin model lol.


----------



## Mr Ripper

VPII said:


> Interesting....
> 
> With my MSI Gaming X Trio using the stock bios I can push the core up to +165mhz, which in effect is 2160mhz core, but when I use the MSI Suprim X bios I can only run +105mhz, which in effect gives 2100 core. Score in Time Spy is a fair bit higher though, but memory is running +1200mhz.


Looking at my results, my ~19k Time Spy score says 2100 max clock / 2091 average. I think this is when I locked the frequency.
I have an 18992 score where I was using a custom curve with a max voltage of 1.031v, which says 2115 max / 2054 average - I think this was max 2145 on the custom curve.

I didn't really play around with the standard bios - Is it generally the case that the Suprim bios hinders max clock?


----------



## EarlZ

Where can I get this Ampere memory tester, and is there a trick to the MSI voltage/freq curve to flatten/align the curve after my desired max value? For example, I've selected 2040Mhz at 0.975v, but other voltage levels still have higher clocks and it's a pain to adjust all of them.
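To make the shape concrete, here's a minimal sketch of what "flatten after the chosen point" means: every point at or above the target voltage gets capped to the target clock. The voltage/clock values below are made up for illustration, not read from any card, and the sketch doesn't talk to Afterburner or the driver.

```python
# Illustrative sketch of flattening a V/F curve: clamp every point at or
# above the chosen voltage to the target clock. Sample values are made up.

def flatten_curve(curve, v_target, f_target):
    """Cap the clock of every point at or above v_target to f_target."""
    return [(v, min(f, f_target)) if v >= v_target else (v, f)
            for v, f in curve]

# (voltage in V, clock in MHz) -- hypothetical sample points
curve = [(0.900, 1905), (0.950, 1980), (0.975, 2040),
         (1.000, 2085), (1.050, 2130), (1.100, 2160)]

flat = flatten_curve(curve, 0.975, 2040)
for v, f in flat:
    print(f"{v:.3f} V -> {f} MHz")  # everything from 0.975 V up reads 2040
```

In Afterburner you're effectively doing this by hand to every point right of your target, which is why it's such a pain.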


----------



## mouacyk

EarlZ said:


> Where can I get this Ampere memory tester and is there a trick to the MSI voltage/freq curve to flatten/align the curve after my desired max value, for example I've selected 2040Mhz on 0.975, other voltage levels still have higher clocks and its a PIA to adjust all of them.


Download in description:


----------



## Imprezzion

K so, I have to order some SMD stuff and wanna order a bunch of shunt resistors for shunt modding "just in case". Which resistance, form factor and amount do I need for a 2x8 pin Gigabyte Gaming OC?

I assume 1206 and 8mohm?


----------



## Falkentyne

Imprezzion said:


> K so, I have to order some SMD stuff and wanna order a bunch of shunt resistors for shunt modding "just in case". Which resistance, form factor and amount do I need for a 2x8 pin Gigabyte Gaming OC?
> 
> I assume 1206 and 8mohm?


No, you need 2512 and 8 mOhms.

If you want to yeet it with 5 mOhms you can get these.

https://www.mouser.com/ProductDetail/71-WSL25125L000FEA18/ (5 mOhms, 2W)
https://www.mouser.com/ProductDetail/667-ERJ-M1WSF5M0U/ (5 mOhms, 1W)

No yeet:
https://www.mouser.com/ProductDetail/667-ERJ-M1WSF8M0U/ (8 mOhms, 1W)

If you are desoldering the original shunts and replacing, you want these.
https://www.mouser.com/ProductDetail/71-WSL25123L000FEA18/ (3 mOhms, 1W, will work but 2W is technically the 'reference spec', but only Nvidia and MSI seem to be following it!)

I assume you're soldering right?
Do you have a high quality 65W or better soldering iron? Do you have high quality Kester solder wire?

Do keep in mind that there's no harm in doing 5 mOhms instead of 8 mOhms. You are going to run into MSVDD limits long before you run into the theoretical TDP limits anyway.

As far as the 1206 shunts:
1) the Asus cards do not have them.
2) the Founder's Edition has some strange shunt slightly longer than 1206, but the same width. Dante`afk and I modded them but saw absolutely no difference on any rail shown in hwinfo64. Maybe something's wrong with the contact, or maybe they do something else, or maybe they have nothing to do with the MSVDD limits we are running into, causing a high TDP Normalized#. They "seem" to have contact, but neither I nor @dante`afk have checked with a multimeter for continuity.

3) It is completely unknown if the reference cards, which DO have (three to four) actual 1206 5 mOhm shunts, will show any benefit if you solder stack another 1206 5 mOhm on it. No one has tried.
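For anyone sizing parts: stacking is just putting two resistors in parallel, and the card's power reading scales by the effective-over-original ratio. A quick sketch of the math, assuming a 5 mOhm original shunt (verify on your own board; the values here are only for illustration):

```python
# Parallel-resistance math behind shunt stacking. The 5 mOhm original
# value is an assumption -- check your own card's shunts before relying on it.

def parallel(r1, r2):
    """Effective resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

R_ORIG = 5.0  # mOhm, assumed original shunt

for stacked in (8.0, 5.0, 3.0):
    r_eff = parallel(R_ORIG, stacked)
    # The controller senses current from the voltage drop across the shunt,
    # so a smaller effective shunt makes it under-read power by this ratio:
    ratio = r_eff / R_ORIG
    print(f"{stacked:g} mOhm stacked on {R_ORIG:g}: {r_eff:.2f} mOhm effective, "
          f"card reads {ratio:.0%} of true power "
          f"(a 370 W limit becomes ~{370 / ratio:.0f} W actual)")
```

Which is why an 8 mOhm stack is the gentler, "no yeet" option and 5 mOhm roughly doubles the effective limit.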


----------



## Imprezzion

Nah, just gonna stack 8mohm if I do it. No yeet on this card. Especially no yeet considering it's on stock air. I just want enough power limit to stop it from throttling too much.

I got a temp-adjustable 90w station with an iron and a desoldering vacuum gun. It ain't "name brand" tho hehe. So I won't call it high quality lol.

I gotta test it out by putting the 470uf caps back on my 2080 Ti - they came off when I pulled the copper heatsinks, which had way too strong thermal tape. If that goes well I might do the shunts on the 3080.


----------



## Falkentyne

Imprezzion said:


> Nah just gonna stack 8mohm if I do it. No yeet on this card. Especially no yeet considering it's on stock air. I just want enough power limit to stop it from throttling too much.
> 
> I got temp adjustable 90w station with an iron and a de-soldering vacuum gun. It ain't "name brand" tho hehe. So I won't call it high quality lol.
> 
> I gotta test it out putting the 470uf caps back on my 2080 Ti which let go when I pulled the copper heatsinks off which had way too strong of a thermal tape. If that goes well I might do the shunts on the 3080.


I mean, don't use a starter 25W soldering iron, that's what I was referring to. As long as you have something like a TS100 iron or better, you're fine.

If you're unsure about your soldering skills, use 3M high temp polyimide 92 Kapton tape for safe protection around the shunt. A desoldering wick is handy as well. But you seem to have experience already, so you probably don't need my advice much.

Just get some good quality Rosin flux, then remember to flux the edges of the original shunt, tin the iron, apply solder to the tip and then hold the iron around the silver part of the original shunt. You will need to hold it there for some seconds after the flux melts, because the solder won't flow until the shunt heats up enough for the solder to transfer from the soldering iron to the shunt. The problem is, the PCB absorbs a TON of heat--a deceptive ton of heat, so it takes time to heat the shunt up enough for the solder to flow. 375C will work pretty well. Once it starts transferring, just move the iron around until you get a nice little oval of solder around the edge of the shunt, then repeat the same process on the other edge. The difficult part here is keeping the shunt in place. Don't be afraid to flux or even re-flux if you have some difficulties.

Once you have your two oval solder joints on top of the original shunt, apply flux (important) again on top of the new solder, then put the new shunt on it, hold it carefully with tweezers, then hold the iron to the side of both shunts, where you put the solder at. Do one side at a time, and only apply a little pressure on the shunt with the tweezers (You just want to keep it in place). You need to get one side fully melted and bonded first. Once you do that, let it cool a bit, then move onto the next side, and then when you do the next side, you won't have to worry about the shunt sliding around anymore...just push it down gently as you bond the solder on that side. You will find the second side is much easier than the first.


----------



## VPII

Mr Ripper said:


> Looking at my results my ~19k Time Spy score which says 2100 max clock 2091 average. I think this is when I locked the frequency.
> I have a 18992 score where I was using a custom curve with a max voltage of 1.031v which says 2115 max / 2054 average - I think this was max 2145 on the custom curve.
> 
> I didn't really play around with the standard bios - Is it generally the case the Suprim bios hinders max clock?


In all honesty I cannot really say if it hinders max clocks. Maybe it is just that you have a lot more power as a limit which you don't have with the Gaming X Trio bios and as such it keeps higher clocks without reaching the limit.

Here is my best run.









I scored 19 185 in Time Spy
AMD Ryzen 9 5950X, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10
www.3dmark.com


----------



## Imprezzion

Falkentyne said:


> I mean, don't use a starter 25W soldering iron, that's what I was referring to. As long as you have something like a TS100 iron or better, you're fine.
> 
> If you're unsure about your soldering skills, use 3M high temp polymide 92 Kapton tape for safe protection around the shunt. Also a desoldering wick is handy as well. But you seem to have experience already so you probably don't need my advice much.
> 
> Just get some good quality Rosin flux, then remember to flux the edges of the original shunt, tin the iron, apply solder to the tip and then hold the iron around the silver part of the original shunt. You will need to hold it there for some seconds after the flux melts, because the solder won't flow until the shunt heats up enough for the solder to transfer from the soldering iron to the shunt. The problem is, the PCB absorbs a TON of heat--a deceptive ton of heat, so it takes time to heat the shunt up enough for the solder to flow. 375C will work pretty well. Once it starts transferring, just move the iron around until you get a nice little oval of solder around the edge of the shunt, then repeat the same process on the other edge. The difficult part here is keeping the shunt in place. Don't be afraid to flux or even re-flux if you have some difficulties.
> 
> Once you have your two oval solder joints on top of the original shunt, apply flux (important) again on top of the new solder, then put the new shunt on it, hold it carefully with tweezers, then hold the iron to the side of both shunts, where you put the solder at. Do one side at a time, and only apply a little pressure on the shunt with the tweezers (You just want to keep it in place). You need to get one side fully melted and bonded first. Once you do that, let it cool a bit, then move onto the next side, and then when you do the next side, you won't have to worry about the shunt sliding around anymore...just push it down gently as you bond the solder on that side. You will find the second side is much easier than the first.


Wow, thanks a lot for the detailed write-up. Well, I do have quite some experience soldering larger components like car wiring harnesses, speaker wiring and so on, but not much SMD work, as my iron never really had the tip for it - but now that I got the station I do.

I will have to order the flux, solder and tape tho. All I got is some Griffon S-39 flux I just use on everything and some generic lead-free solder wire.


----------



## SirCanealot

Imprezzion said:


> One issue tho.. if I wanna put on the Arctic backplate then I have to obviously remove the stock one which means the PCI-E 8 pin adapters aren't supported anymore as they are mounted to the backplate.


Did you ever look at doing this? I'm actually using the Arctic backplate myself, but I've had a lot of issues mounting it on my Palit Gaming Pro 3080. My original clips are bent out of shape and are basically useless at this point, so I've used cable ties, but I just don't think I have enough pressure really :/ (trying to get some replacement clips from Arctic)
The other issue is that (at least for my Palit card) the PCB is so crowded that there really aren't many good points to add the clips and get a good, tight fit.
Finally, a lot of the memory chips aren't really covered by the backplate, and on my PCB the mounting holes in the backplate are basically exactly where a lot of the memory chips are.

I've got some better cable ties coming and I'm going to glue some more heatsinks onto it to cover more of the gaps, but I'm not sure how well it's going to work...


----------



## Mr Ripper

VPII said:


> In all honesty I cannot really say if it hinders max clocks. Maybe it is just that you have a lot more power as a limit which you don't have with the Gaming X Trio bios and as such it keeps higher clocks without reaching the limit.
> 
> Here is my best run.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 19 185 in Time Spy
> AMD Ryzen 9 5950X, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10
> www.3dmark.com


So far I'm testing the new card on the standard bios, and +1250 vram has been error-free on the 12 minute test. I'll do more testing between the 2 bioses and report back (maybe the Suprim bios hindered the ram?).

I'm running an old X5675, but that shouldn't affect the graphics score I presume. Maybe PCIE 2.0 does a little? I'm in the process of upgrading to a 5800X, so I can see if it makes any difference with the same GPU settings.


----------



## BluePaint

Mr Ripper said:


> maybe the Suprim bios hindered the ram?.
> 
> I'm running an old X5675 but that shouldn't effect the graphics score I presume. Maybe PCIE 2.0 does a little? I'm in the process of upgrading to a 5800X so I can see if it makes any difference with the same GPU settings.


VRAM speed is mostly silicon lottery and cooling, not the BIOS.

CPU + RAM speed has a relatively large influence even on the Time Spy GPU score. Port Royal is more CPU/RAM-agnostic.


----------



## Imprezzion

SirCanealot said:


> Did you ever look at doing this? I'm actually using the Arctic backplate myself, but I've had a lot of issues mounting it on my Palit Gaming Pro 3080. My original clips are bent out of shape and are basically useless at this point, so I've used cable ties, but I just don't think I have enough pressure really :/ (trying to get some replacement clips from Arctic)
> The other issue is that (at least for my Palit card) the PCB is so crowded that there really aren't many good points to add the clips and get a good, tight fit.
> Finally, a lot of the memory chips aren't really covered by the backplate and on my PCB the mounting holes in the backplate are basically exactly where a lot of the memory chips are
> 
> I've got some better cable ties coming and I'm going to glue some more heatsinks onto it to cover more of the gaps, but I'm not sure how well it's going to work...


Mine are the same. Most are missing the rubber, so I guess I'll have to see if it works with this PCB or if I have to figure out a different mounting system.

Might even end up modifying the backplate.. with an angle grinder.. hehe.


----------



## VPII

Mr Ripper said:


> So far I'm testing the new card on the standard bios and +1250 vram has been error free on the 12 minute test. I'll do more testing between the 2 bioses and report back (maybe the Suprim bios hindered the ram?).
> 
> I'm running an old X5675 but that shouldn't effect the graphics score I presume. Maybe PCIE 2.0 does a little? I'm in the process of upgrading to a 5800X so I can see if it makes any difference with the same GPU settings.


Ahhh, you will get a nice bump with the 5800X. My card can also do +1250mhz memory, I think even +1300mhz, but I have not tested it as yet. The only reason the core clock might be lower with the Suprim X bios is the higher power limit keeping the card at higher clock speeds for longer.


----------



## Imprezzion

I noticed that too with the core clock. It's weird how boost behaves. If I set the boost to just run a core offset with +100 voltage and let the card decide, it will boost to like 2070-2085Mhz @ 1.062-1.081v quite a lot, but if I set a custom curve there it will crash, weirdly enough.

Oh well, I'm quite positively surprised my mid-range Gigabyte Gaming OC does so well. I just run a custom fan curve that goes to 100% at 60c, as it's not loud at all even at 100% (gaming BIOS, not the silent one). Usually it sits around 57-59c at 90-95% fan speed. It's kinda nice to have an air-cooled card that can maintain temps so well at a constant 350-360w load. The only concerning thing is the junction temps. They usually sit at 80-86c even with the core just under 60c.

I gotta admit tho, if I can somehow get my hands on an ASUS Strix, EVGA FTW3 or Suprim X 3080, I would swap just for having 3x8 pin and more power limit to play with.


----------



## SirCanealot

Imprezzion said:


> Mine are the same. Most are missing the rubber so I guess I'll have to see if it works with this PCB or if I have to figure out a different mounting system.
> 
> Might even end up modifying the backplate.. with a angle grinder.. hehe.


Please let me know how you do! Might be useful to compare information...


----------



## tiango

Hey guys. I have a Palit Gaming Pro rtx 3080. I'm looking for ways to improve the memory temp, at least a bit, just for peace of mind anyway. The card is supposed to have thermal pads on the backplate, although I haven't removed it to check just yet.
I came across this old thread where the user shows how he adds a big-ass heat sink on the backplate:








GPU Backplate Mod (www.overclock.net)

I was messing around with my IR thermometer when I discovered that the backplate on my GTX Titans was almost 60 C. That was a lot warmer than I expected and I decided to see if there was anything I could do about it. I have a box of old heatsinks and placed a couple on the backplate with some...




I've also seen people on Reddit who added a CPU heat sink and fan to the backplate.
What do you think about that? I like the idea because it is really cheap and easy to do.
I'm from Argentina, so there aren't many things I can get here, but I found a cheap 10 x 20 cm aluminum heatsink, and (more interestingly) this bad boy here:





Coolers y Ventiladores Coolers para PC - articulo.mercadolibre.com.ar




Would it make sense to place that copper heat sink on the backplate, aligned with the GPU core and the memory, with the fan pulling air away from the card?
I mean, if aluminum heat sinks work, copper should dissipate the heat even better, and I guess the little fan would help a bit.
Is this a dumb idea? It's just like 9 USD anyway, lol.
I'm thinking of buying that plus a thermal pad to place it on the PCB, although I don't know if it would be better to use thermal paste or thermal glue.
Thanks!


----------



## chiknnwatrmln

So I was having high memory tjunction temps on my MSI Trio X 3080 (like many other AIB cards). Gaming would see 90c, mining would see 110c + throttling.

I tried what some people online suggested: adding additional thermal pads under the backplate (the Trio has some already) and sticking on some little heat sinks. This got me sitting at 106c during mining - still too hot.

I went a step further: re-pasted the card and put new high-performance thermal pads (12.7 W/mK) on the memory. Others online mentioned that this helps some, and can reduce temps to maybe 100c.

However, I think I found the real cause of high tjunction temps on this card! The memory thermal pads sit on a shim, and that shim is screwed to the main cooler block. This shim has almost no thermal paste! I removed it, cleaned it up, applied some thermal grizzly kryonaut, and after re-assembly mining sees my tjunction temp under 86c with 30% fan speed!! Massive improvement.

Setting fan speed to 80% (what it would be at before) yields tjunction temps of under 75c.









Card with the little heat sinks on the back. Doesn't add much cooling.








This is the shim the memory pads sat on. I applied TIM to the pads earlier; not much help, just messy, and it ruins the pads.








The shim with a woeful amount of TIM.








The TIM I applied - I was a little too generous (also added some 1mm pads up top). The area on the left isn't flat so I added extra there to maintain some sort of contact.








Before cleanup.








After cleanup and applying new thermal pads.

Now mining happily at 100MH/s.


----------



## MrKenzie

I played with the voltage frequency last night and I'm surprised by the results! 

Using the regular core offset way, I could only benchmark at 2220MHz as it wasn't running the full 1.1V.

Now with 1.1V available it will pass every time at up to 2295MHz (averages about 2260). My memory can now run an extra +100 too.

Chilled water loop, iGame Advanced OC with 450W Strix bios.

It might be worth trying for anyone that hasn't bothered yet.


----------



## Falkentyne

MrKenzie said:


> I played with the voltage frequency last night and I'm surprised by the results!
> 
> Using the regular core offset way, I could only benchmark at 2220MHz as it wasn't running the full 1.1V.
> 
> Now with 1.1V available it will pass every time at up to 2295MHz (averages about 2260). My memory can now run an extra +100 too.
> 
> Chilled water loop, iGame Advanced OC with 450W Strix bios.
> 
> It might be worth trying for anyone that hasn't bothered yet.


Post your curve.
I bet you your effective clocks (in hwinfo64) took a massive nose dive.


----------



## MrKenzie

Falkentyne said:


> Post your curve.
> I bet you your effective clocks (in hwinfo64) took a massive nose dive.







I'm away from my pc at the moment. As for effective clocks, it holds a rock steady 2250MHz in Shadow Of The Tomb Raider.


----------



## Falkentyne

MrKenzie said:


> I'm away from my pc at the moment. As for effective clocks, it holds a rock steady 2250MHz in Shadow Of The Tomb Raider.


Wrong.
Effective clocks can NOT be fixed. They are PLL clocks. They change nonstop.
The farther effective clocks are from requested clocks, the lower your performance will be.
This is caused by MSVDD not being able to keep up with NVVDD / requested clocks, usually because you hit an internal rail power limit or you messed with the V/F Curve without adjusting MSVDD voltage.
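If you log your clocks to a CSV file in HWiNFO64, a quick script can show how far effective clocks actually sit below requested clocks on average. This is just a sketch: the column names below are assumptions, so match them against the header row of your own log.

```python
import csv
import io

# Stand-in for a few rows of an HWiNFO64 CSV log. The column names
# "GPU Clock [MHz]" and "GPU Effective Clock [MHz]" are assumptions --
# check the header of your own log and adjust.
SAMPLE_LOG = """GPU Clock [MHz],GPU Effective Clock [MHz]
2100,2085
2100,2010
2085,1995
2100,2040
"""

def average_clock_gap(csv_text, requested_col="GPU Clock [MHz]",
                      effective_col="GPU Effective Clock [MHz]"):
    """Return (avg requested, avg effective, avg gap), all in MHz."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    requested = [float(r[requested_col]) for r in rows]
    effective = [float(r[effective_col]) for r in rows]
    avg_req = sum(requested) / len(requested)
    avg_eff = sum(effective) / len(effective)
    return avg_req, avg_eff, avg_req - avg_eff

avg_req, avg_eff, gap = average_clock_gap(SAMPLE_LOG)
print(f"requested {avg_req:.1f} MHz, effective {avg_eff:.1f} MHz, gap {gap:.1f} MHz")
```

A big average gap (tens of MHz or more) is the "nose dive" described above; a few MHz is normal.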


----------



## ssgwright

Falkentyne said:


> Wrong.
> Effective clocks can NOT be fixed. They are PLL clocks. They change nonstop.
> The farther effective clocks are from requested clocks, the lower your performance will be.
> This is caused by MSVDD not being able to keep up with NVVDD / requested clocks, usually because you hit an internal rail power limit or you messed with the V/F Curve without adjusting MSVDD voltage.


Ah, this explains a lot!!! Thanks for the info.


----------



## MrKenzie

Falkentyne said:


> Wrong.
> Effective clocks can NOT be fixed. They are PLL clocks. They change nonstop.
> The farther effective clocks are from requested clocks, the lower your performance will be.
> This is caused by MSVDD not being able to keep up with NVVDD / requested clocks, usually because you hit an internal rail power limit or you messed with the V/F Curve without adjusting MSVDD voltage.


So would I need to check before and after performance with benchmarks to see if I'm gaining anything?

The issue I was having was that the core voltage would almost never step up to 1.1V when using the core offset method; would that be because I was nearing the power limit?


----------



## ssgwright

It's odd... the most I've seen on mine is 1.1v, and that's on chilled water, never breaking 37c. The volts never go over 1.1, and I'm shunt modded, so I'm not hitting power limits.


----------



## MrKenzie

ssgwright said:


> it's odd... the most I've seen on mine is 1.1v and thats on chilled water never breaking 37c the volts never go over 1.1 and I'm shunt modded not hitting power limits


I'm guessing 1.1V is the maximum unless you have an XOC card, either way I'm happy with 2200MHz+ stable for gaming.


----------



## Imprezzion

Yeah well, my card would be able to do it if it didn't have such a low power limit haha.

I put it up for trade locally for a 3x8 pin model of any brand. Let's see if someone wants to bite.

So far it's performing great for a 2x8 pin card, but as you guys mentioned, effective clock can drop as low as 1950-1980Mhz when it power throttles hard in games like Division 2. It cannot sustain anything above 1.006v at 355-370w. In Cyberpunk, probably due to DLSS, the load is a bit lower, so it usually sits around 2070-2100Mhz at 1.062-1.075v, which is what I was targeting, but yeah. Hardly any game actually runs those clocks because of the power limit.


----------



## rjrusek

Has anyone flashed their Gigabyte Gaming OC with the AORUS Master / Waterforce BIOS?

I am running water and wanted to know if there would be any benefit to running the AORUS Master / Waterforce BIOS on the Gaming OC.

Thanks in advance,
RJR


----------



## AngEv1L

Imprezzion said:


> Yeah well, my card would be able to do it if it didn't have such a low power limit haha.
> 
> I put it up for trade locally for a 3x8 pin model of any brand. Let's see if someone wants to bite.
> 
> So far it's performing great for a 2x8 pin card, but as you guys mentioned, effective clock can drop as low as 1950-1980Mhz when it power throttles hard in games like Division 2. It cannot sustain anything above 1.006v at 355-370w. In Cyberpunk probably due to DLSS the load is a bit less so it usually sits around 2070-2100Mhz at 1.062-1.075v which is what I was targeting but yeah. Hardly any game actually runs those clocks because of the power limit.


You need to do a custom curve for games. Maybe 0.92-0.95v for 1995-2010MHz; that is stable in games on a 350-370 watt BIOS.


----------



## Imprezzion

rjrusek said:


> Has anyone flashed their Gigabyte Gaming OC with the AORUS Master / Waterforce BIOS?
> 
> I am running water and wanted to know if there would be any benefit to running the AORUS Master / Waterforce BIOS on the Gaming OC.
> 
> Thanks in advance,
> RJR


Haven't tried it yet but I can do it. Mine just runs the Loud BIOS on both BIOS slots but I'll switch to the other slot and see which BIOS works on it.

Probably not going to do a thing as they all have 370w limit but who knows.. hehe.


----------



## Mr Ripper

chiknnwatrmln said:


> So I was having high memory tjunction temps on my MSI Trio X 3080 (like many other AIB cards). Gaming would see 90c, mining would see 110c + throttling.
> 
> I tried what some people online suggested, adding additonal thermal pads under the backplate (the trio has some already) and sticking on some little heat sinks. This got me sitting at 106c during mining - still too hot.
> 
> I went a step further, re-pasted the card and put new high performance thermal pads (12.7 w/mk) on the memory. Others online mentioned that this helps some, and can reduce temps to maybe 100c.
> 
> However, I think I found the real cause of high tjunction temps on this card! The memory thermal pads sit on a shim, and that shim is screwed to the main cooler block. This shim has almost no thermal paste! I removed it, cleaned it up, applied some thermal grizzly kryonaut, and after re-assembly mining sees my tjunction temp under 86c with 30% fan speed!! Massive improvement.
> 
> Setting fan speed to 80% (what it would be at before) yields tjunction temps of under 75c.
> 
> View attachment 2478365
> 
> Card with the little heat sinks on the back. Doesn't add much cooling.
> View attachment 2478366
> 
> This is the shim where the memory pads sat on. I applied TIM to the pads earlier, not much help, just messy and ruins the pads.
> View attachment 2478367
> 
> The shim with a woeful amount of TIM.
> View attachment 2478368
> 
> The TIM I applied - I was a little too generous (also added some 1mm pads up top). The area on the left isn't flat so I added extra there to maintain some sort of contact.
> View attachment 2478369
> 
> Before cleanup.
> View attachment 2478370
> 
> After cleanup and applying new thermal pads.
> 
> Now mining happy at 100mh/s


Thanks for taking the time to write this up. I have the same card and was wondering what the memory temps are - how do you monitor them?

I had a crash during a game after some hours; according to Event Viewer, nvlddmkm stopped responding and recovered, but my screen remained green so I had to reboot. I was wondering if this might have been the memory temps creeping up, as I am running it at +1250. It could have been the core too, so I'll have to isolate one of them.

I'm going to be watercooling the card soon and will be ensuring I do what I can to keep the memory as cool as possible.


----------



## Imprezzion

Green screen is 100% memory related. I get the feeling ECC is nice, but it doesn't correct all errors and doesn't prevent every single crash. I had my card hard lock a benchmark at +1400 several times while +1200 is perfectly fine.

I'm going to play with the curve a bit and see if I can get a bit more stable of a clock out of it, instead of it going all over the place in different games.


----------



## Ziver

Guys, what do you think about the Gigabyte RTX 3080 Master Rev 2.0? I'm using it for games.


----------



## Imprezzion

Oh boy you guys will never believe me...

I put the Arctic backplate on my card now. It doesn't cover the entire card memory-wise, but it's still better than nothing, unlike the original backplate, which had no thermal interface whatsoever.

But that isn't the special thing.

I also decided to mod the fan connectors as I remembered from water-cooling a 2080 Ti that the fans of the card are included in the total power draw and this can be quite a lot.

I left the PWM and RPM wires in the original plugs, while the +12v and ground wires I pulled out of the connectors and now run through a PWM splitter powered by the motherboard.

This resulted in *40w* less power draw, which means I can now run 1.081v in Cyberpunk without it even being close to throttling. It barely touches 330w.

So yeah, just not powering your fans through the card makes an absolutely massive difference.

I also put CLU on the core which means it now runs like 54c load on air lol.

See screenshot below. This is +100mv, +60 core, +1200 memory, in CP2077 1080p all maxed settings RT Psycho DLSS Quality with quite a heavy ReShade on top.

Before this "mod" it would run 1980-2025 core and throttle voltage as low as 0.987v. Now it just sits at 1.081v and stays there.










EDIT: made a curve for 1.043v as that seems to be about the most efficient voltage for it.

It runs 2085-2100 @ 1.043v with +1200 memory just fine so far. Got the same 60c I had at 100% fan speed now with 73%. Just a shame the junction is still the exact same temp even with the Arctic backplate. The backplate is incredibly hot btw, which means it's doing something very effective: it's absorbing a LOT of heat and drawing it away from the card, just not helping the junction temps.
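For a rough feel of why 1.043v is the sweet spot: dynamic power scales roughly with frequency times voltage squared, so a small voltage drop buys a disproportionate power saving. A back-of-the-envelope sketch (this ignores static/leakage power and memory power, so treat it as an estimate only):

```python
def relative_dynamic_power(f1_mhz, v1, f2_mhz, v2):
    """Ratio of dynamic power at (f2, v2) versus (f1, v1), using P ~ f * V^2.
    Leakage and memory power are ignored, so this is only a rough estimate."""
    return (f2_mhz / f1_mhz) * (v2 / v1) ** 2

# ~2100 MHz @ 1.081 V vs ~2085 MHz @ 1.043 V (the clocks/voltages quoted above)
ratio = relative_dynamic_power(2100, 1.081, 2085, 1.043)
print(f"undervolted profile draws roughly {(1 - ratio) * 100:.0f}% less core power")
```

That works out to roughly 8% less core power for under 1% clock loss, which is why the lower-voltage curve holds its boost so much better against the power limit.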


----------



## MrKenzie

Imprezzion said:


> Oh boy you guys will never believe me...
> 
> I put the Arctic backplate on my card now and it doesn't cover the entire card memory wise but it's always better then nothing like the original backplate which had no thermal interface whatsoever.
> 
> But that isn't the special thing.
> 
> I also decided to mod the fan connectors as I remembered from water-cooling a 2080 Ti that the fans of the card are included in the total power draw and this can be quite a lot.
> 
> I have the PWM and RPM wires in the original plugs and the +12v and ground wires I pulled out of the connectors and are now run through a PWM splitter powered by the motherboard.
> 
> This resulted in *40w *less power draw which means I can now run 1.081v in Cyberpunk without it even being close to throttling. It barely touches 330w.
> 
> So yeah, just not powering your fans through the card makes a absolutely massive difference.
> 
> I also put CLU on the core which means it now runs like 54c load on air lol.
> 
> See screenshot below. This is +100mv, +60 core, +1200 memory, in CP2077 1080p all maxed settings RT Psycho DLSS Quality with quite a heavy ReShade on top.
> 
> Before this "mod" it would run 1980-2025 core and throttle voltage as low as 0.987v. Now it just sits at 1.081v and stays there.
> 
> View attachment 2478608
> 
> 
> EDIT: made a curve for 1.043v as that seems to be about the most efficient voltage for it.
> 
> It runs 2100-2085 @ 1.043v with +1200 memory just fine so far. Got the same 60c I had on 100% fanspeed now with 73%. Just a shame the Junction is still the exact same temp even with the Arctic backplate. The backplate is incredibly hot btw which means it's doing something very effective as it's consuming a LOT of heat and drawing it away from the card, just not the Junction temps.
> 
> View attachment 2478613


Since Falkentyne alerted me to effective clocks, I have done testing in S.O.T.T.R between stock (370W bios), max OC (450W bios), and undervolted OC (maximum 320W measured) and the results were a bit scary to be honest.

With my Strix bios, stock is 2070MHz on the curve.
My undervolted OC (2070MHz set, 2000MHz effective) performed 3.7% higher than stock with 15-20% less power used.
My maximum OC (2295MHz set, 2170MHz effective) performed 4% higher than the undervolted OC with 37% more power used!
I would suggest playing around to find what best suits your needs. My maximum OC runs great with S.O.T.T.R (2170MHz effective) where it isn't hitting the power limiter, but in Metro Exodus, my effective clocks are 170MHz lower as it is constantly against the 450W power limit, and even undervolting doesn't help much.
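Putting those numbers into perf-per-watt terms makes the trade-off obvious. This is just the arithmetic on the figures quoted above, normalized to stock, with the power midpoints (17.5% for "15-20% less") picked for illustration:

```python
def perf_per_watt(perf, power):
    """Efficiency as relative performance divided by relative power."""
    return perf / power

# Everything normalized to stock = 1.0 for both performance and power.
stock = perf_per_watt(1.000, 1.000)
# Undervolted OC: +3.7% perf over stock, ~17.5% less power (midpoint of 15-20%).
undervolt = perf_per_watt(1.037, 0.825)
# Max OC: +4% perf over the undervolt, 37% more power than the undervolt.
max_oc = perf_per_watt(1.037 * 1.04, 0.825 * 1.37)

for name, eff in [("stock", stock), ("undervolt", undervolt), ("max OC", max_oc)]:
    print(f"{name}: {eff:.2f}x stock efficiency")
```

On these numbers the undervolt is about 26% more efficient than stock, while the maximum OC is actually less efficient than stock despite the higher score.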


----------



## Imprezzion

So far my effective clock is usually the same as the actual clock or only 10-20Mhz lower at worst.

I tested a lot higher clocks now with a combination of a high offset and a limit on the curve to prevent it from boosting too high in game menus or whatever.

It's set as +135 offset with it limited to 2100 @ 1.043v which means it runs effective 2025-2055Mhz at 0.987-1.006v in Division 2 and 2100-2085Mhz in CP2077.

I'm very surprised to see it running as high as 2055Mhz seemingly stable in a game as heavy as Division 2 at 0.987-1.006v.. I really didn't expect it to do that. Is it "normal" for a 3080 to do 2040Mhz and up at a voltage as low as 0.987-1.006v? 

I wonder if I can get close to your 2170 effective or even above it with this card.. it has potential enough lol.


----------



## chiknnwatrmln

Mr Ripper said:


> Thanks for taking the time to write this up. I have the same card and was wondering what the memory the temps are - How do you monitor them?
> 
> I had a crash during a game after some hours and according to event viewer nvlddmkm dropped responding and recovered but my screen remained green so I had to reboot. I was wondering if this might have been the memory temps creeping up as I am running it at +1250. Could have been core also so I'll have to isolate one of them.
> 
> I'm going to be watercooling the card soon and will be ensuring I do what I can to keep the memory as cool as possible.


Use the latest version of HWInfo64; it has monitoring capabilities for memory tjunction.

Keep in mind we have ECC memory, so higher is not necessarily faster  

Good luck, it was a cheap mod, $15 thermal pads and $5 in thermal paste made a massive difference.


----------



## Mr Ripper

chiknnwatrmln said:


> Use the latest version of HWInfo64, it has monitoring capabilites for memory tjunction.
> 
> Keep in mind we have ECC memory, so higher is not necessarily faster
> 
> Good luck, it was a cheap mod, $15 thermal pads and $5 in thermal paste made a massive difference.


That's great. I didn't think the Trio had an internal memory temperature sensor - I guess they all do then.

+1250 passed the 12-minute Ampere memory test without errors, but I think 1300 or so didn't, which means it's likely borderline, and prolonged use / increased temperatures are probably bringing in errors. I'm waiting on my water cooler, so I might just replace the pads and do the paste in the meantime anyway.

I presume I can unplug the LED strip while I'm at it?


----------



## chiknnwatrmln

Mr Ripper said:


> That's great. I didn't think the Trio had an internal memory temperature sensor - I guess they all do then.
> 
> +1250 passed the 12 minute Ampere memory test without errors, but I think 1300 or so didn't which means it's likely borderline and prolonged use / increased temperatures is probably bringing in errors. I'm waiting on my water cooler so I might just replace the pads and do the paste in the meanwhile anyway.
> 
> I presume I can unplug the LED strip while I'm at it?


From what I saw the LEDs are directly attached to the PCB so no way to remove it short of mods.

I completely disabled the card's LEDs with MSI's crappy dragon software then promptly uninstalled it lol


----------



## Imprezzion

I noticed and measured zero difference in reported power draw with my LEDs disconnected on the Gigabyte Gaming OC.

Fans powered by the motherboard did make a huge difference but still it power throttles a lot in some games, less in others.

I played 2 raids in Division 2 now and it is hovering around 1950-2010 @ 0.962-1.000v even with a +105 offset.. It just does not wanna run any higher voltage in Division 2, even with the 30w extra from the fans being disconnected. It just boosts a bit higher more often.

I did also notice in CP2077 you were right about effective clock vs read out clocks. If I set it to do like 2100 @ 1.081v with a curve effective clock is way lower, like 2040 ish. If I then set just offset +60, which is like 2055-2070Mhz it has the exact same effective clock but with way lower power draw and voltage. 

Same goes for Division 2, curve 2015Mhz 0.987v is effective 1950 ish, offset 60 is 1950-1975Mhz with also 1940-1950 effective just less drops and throttling. I'll just leave it at +60 and be happy there.

The card can do so much more but yeah, power limits a b-word.


----------



## chiknnwatrmln

Imprezzion said:


> I noticed and measured zero difference in reported power draw with my LEDs disconnected on the Gigabyte Gaming OC.
> 
> Fans powered by the motherboard did make a huge difference but still it power throttles a lot in some games, less in others.
> 
> I played 2 raids on Division 2 now and it it hovering around 1950-2010 @ 0.962-1.000v even with a +105 offset.. It just does not wanna run any higher voltage in Division 2. Even with the 30w extra from the fans being disconnected. It just boosts a bit higher more often.
> 
> I did also notice in CP2077 you were right about effective clock vs read out clocks. If I set it to do like 2100 @ 1.081v with a curve effective clock is way lower, like 2040 ish. If I then set just offset +60, which is like 2055-2070Mhz it has the exact same effective clock but with way lower power draw and voltage.
> 
> Same goes for Division 2, curve 2015Mhz 0.987v is effective 1950 ish, offset 60 is 1950-1975Mhz with also 1940-1950 effective just less drops and throttling. I'll just leave it at +60 and be happy there.
> 
> The card can do so much more but yeah, power limits a b-word.


If your card is a 3x8 pin card you should be able to flash various 450w BIOSes.

I flashed the Suprim BIOS on my gaming X trio and my card will effectively draw up to 410w.


----------



## Imprezzion

chiknnwatrmln said:


> If your card is a 3 pin card you should be able to flash various 450w BIOS's.
> 
> I flashed the Suprim BIOS on my gaming X trio and my card will effectively draw up to 410w.


Nope, 2x8 pin. That's the problem lol. The card would easily run 2160+MHz if I could feed it enough power and voltage..


----------



## MrKenzie

Imprezzion said:


> So far my effective clock is usually the same as the actual clock or only 10-20Mhz lower at worst.
> 
> I tested a lot higher clocks now with a combination of a high offset and a limit on the curve to prevent it from boosting too high in game menu's or whatever.
> 
> It's set as +135 offset with it limited to 2100 @ 1.043v which means it runs effective 2025-2055Mhz at 0.987-1.006v in Division 2 and 2100-2085Mhz in CP2077.
> 
> I'm very surprised to see it running as high as 2055Mhz seemingly stable in a game as heavy as Division 2 at 0.987-1.006v.. I really didn't expect it to do that. Is it "normal" for a 3080 to do 2040Mhz and up at a voltage as low as 0.987-1.006v?
> 
> I wonder if I can get close to your 2170 effective or even above it with this card.. it has potential enough lol.


I think you will be limited by temperature. My card runs at 20c because my water loop goes through an aquarium chiller, chilled to between 10-15c, which keeps condensation manageable. Yes, these cards drop clocks even at low temperatures of 25c or so, same as my Pascal card did. It's fun going for that maximum frequency number, but when is it no longer worth the horrendous reduction in efficiency?


----------



## chiknnwatrmln

Imprezzion said:


> Nope 2 pin. That's the problem lol. The card will easily run 2160+MHz if I could feed it enough power and voltage..


Ah got it. FWIW you're not missing much, even with 400+ watts the performance gain is minimal


----------



## ducegt

After an Ace Combat 7 session which pegs my GPU, my Trio flashed w/ Suprim +150/+999 430w completed TSE stress test (97.9%) with max VRAM junction of 102C on stock cooler in an inverted ATX case.


----------



## Imprezzion

I've been testing some random other 2x8 pin BIOSes just to see if there was any difference between them at all in, like, power draw or effective clock or whatever.

So far the Gigabyte (Rev 1 Aorus Xtreme / Waterforce), ASUS (TUF), Inno3D and Palit/KFA2 BIOSes (370-375w) all behave exactly the same. No real difference in effective clock, power limit or throttling behavior.

However, the EVGA XC3 Ultra BIOS, despite having a lower power limit of 366w, shows quite impressive results. The graph is much, much smoother, not dropping nearly as far; it actually draws considerably more power than any other 370w BIOS, and effective clocks are much more stable and in line with set clocks.

With the stock Gigabyte 370w OC BIOS it runs around 335-340w in MSI AB / HWINFO64 and it throttles there dropping voltage and clocks. This usually means it reads 95-100% power.

With the EVGA XC3 Ultra 366w BIOS it runs around 350-355w in MSI AB / HWINFO64 and doesn't drop nearly as far as it allows it to go above 100% power. It sits around 100-102% the entire time and effective clocks are much higher and more stable.

I'll do some benching in an hour or so. I saved my best results with the stock BIOS for Time Spy. Let's see if this is actually better.

Oh and I tested the VRAM with that error checker tool, +1200 no errors in the 12 minute and long test. Junction temps ~82c. Let's see how high it can go.

EDIT: So yeah, switching between the EVGA XC3 Ultra BIOS on slot 2 and the stock Gigabyte Gaming OC's OC BIOS on slot 1, I see a large difference. Sustained (effective) clocks are better on the XC3, it draws about 15-20w more power consistently, and benchmark scores in Time Spy at the same set clocks are about 200-250 points higher. So yeah, XC3 Ultra BIOS. Definitely recommend trying that on a random 2x8 pin card!

Still, I find it strange that a BIOS with a 370w/370w limit and only a 100% power slider, like the Gigabyte Gaming OC's, only ever draws 340w and shows "100%" at 340w. It's not a 340w BIOS, it's 370w.. so why show 100% at 340w? I don't get it..
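For reference, the percentage the monitoring tools should be reporting is just draw divided by the BIOS limit, which is exactly why "100% at 340w on a 370w BIOS" looks wrong:

```python
def power_percent(draw_w, limit_w):
    """TDP percentage as monitoring tools normally compute it."""
    return draw_w / limit_w * 100

# 340w against a 370w BIOS limit "should" read about 92%, not 100%:
print(f"{power_percent(340, 370):.1f}%")
# One possible explanation: the limiter is tripping on a lower internal
# per-rail limit rather than the total-board 370w number, but that's a guess.
```

If the card really were limited at 370w total board power, the slider would not pin at 100% until the draw actually reached 370w.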


----------



## Ramshot

Imprezzion said:


> I've been testing some random other 2x8pin BIOS just to see if there was any difference between them at all in like, power draw or effective clock or whatever.
> 
> So far the Gigabyte (REV 1 Aorus Xtreme / Waterforce), ASUS (TUF), Inno3D and Palit/KFA2 BIOS (370-375w) all behave the exact same. No real difference in either effective clock, power limit or throttling behavior.
> 
> However. EVGA XC3 Ultra BIOS, despite having a lower power limit of 366w, shows quite impressive results. The graph is much much smoother not dropping as far at all, it actually draws considerably more power then any other 370w BIOS, and effective clocks are much more stable and in line with set clocks.
> 
> With the stock Gigabyte 370w OC BIOS it runs around 335-340w in MSI AB / HWINFO64 and it throttles there dropping voltage and clocks. This usually means it reads 95-100% power.
> 
> With the EVGA XC3 Ultra 366w BIOS it runs around 350-355w in MSI AB / HWINFO64 and doesn't drop nearly as far as it allows it to go above 100% power. It sits around 100-102% the entire time and effective clocks are much higher and more stable.
> 
> I'll do some benching in an hour or so. I saved my best results with the stock BIOS for Time Spy. Let's see if this is actually better.
> 
> Oh and I tested the VRAM with that error checker tool, +1200 no errors in the 12 minute and long test. Junction temps ~82c. Let's see how high it can go.
> 
> EDIT: So yeah, switching between the EVGA XC3 Ultra BIOS on slot 2 and the stock Gigabyte Gaming OC OC BIOS on slot 1 I see a large difference. Sustained clocks (effective clock) is better on the XC3, it draws about 15-20w more power consistently, benchmark scores in Time Spy at the same set clocks are about 200-250 points higher. So yeah, XC3 Ultra BIOS. Definitely recommend trying that in a random 2x8 pin card!
> 
> Still, I find it strange that a BIOS that has 370w/370w and only 100% power slider like the Gigabyte Gaming OC only ever draws 340w and shows "100%" on 340w. It's not a 340w BIOS, it's 370w.. so why show 100% at 340w...I don't get it..



Yes, I was wondering the same thing. Let me know if you find out why.


----------



## Clukos

Finally managed to get an FE at MSRP and put the EK waterblock on it.









I scored 18 605 in Time Spy (AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080, 32GB, 64-bit Windows 10) - www.3dmark.com





Almost 20K gpu score, so close










Thermals are looking nice with undervolting, this is after 40 mins of cyberpunk with RT at 4K










Before putting the block on, the memory would hit 110C and throttle; now it stays at a cool 60-65C.


----------



## Imprezzion

What's the average clock during 3DMark? I feel like my card is underperforming a lot compared to that.. I get the same total score, 185xx, but way lower graphics score even at +105 core +1200 memory.. My CPU score is what saves me at 16xxx with a 5.3/4.9 all core OC on the 10900KF.


----------



## Clukos

Imprezzion said:


> What's the average clock during 3DMark? I feel like my card is underperforming a lot compared to that.. I get the same total score, 185xx, but way lower graphics score even at +105 core +1200 memory.. My CPU score is what saves me at 16xxx with a 5.3/4.9 all core OC on the 10900KF.


It hovers around 2.1GHz but can boost all the way up to 2.2GHz. The memory on this card scales all the way up to +1900 without any errors; I assume the water is helping, because on air I couldn't go above +1500. For that run I was at +200 on the core and +1900 on mem.


----------



## Imprezzion

Clukos said:


> It hovers around 2.1GHz but can boost all the way up to 2.2GHz, the memory on this card scales all the way up to +1900 without any errors, I assume the water is helping because on air I couldn't go above +1500. For that run I was at +200 on the core and +1900 on mem.


Like, how does it maintain that high of a boost lol. Mine drops all the way to 1890Mhz in some tests due to power limit restrictions.. 
This run is the best I had GPU score wise (CPU was at 5.2 not 5.3 in this run but k).

Max GPU clock is 2100 minimum is 1890. +1200 memory. I get errors past 1300 so I keep it here.

It is just on stock air tho. GPU temps 58-60c with Junction 80-86c. 










I tried many many different BIOS for the card and the EVGA XC3 Ultra is performing the best so far power wise but it still obviously throttles like mad. 

I wanted to flash the Founders BIOS but nvflash throws a board ID mismatch at me that can't be overridden lol.. Seems like the FE uses a different board ID compared to partner cards, just like the 20xx cards, so we'd need an nvflash 5.670.0 patched to ignore the board ID mismatch for 3xxx.


----------



## Clukos

It's possible that I got slightly lucky with silicon quality, not sure if you can flash the FE Bios due to the 12-pin shenanigans. I would like to extend the power limit as well, it's really holding the card back in some cases.


----------



## Imprezzion

Clukos said:


> It's possible that I got slightly lucky with silicon quality, not sure if you can flash the FE Bios due to the 12-pin shenanigans. I would like to extend the power limit as well, it's really holding the card back in some cases.


Yeah, and the problem is that I don't wanna shunt mod this thing right away. I'd much rather just trade or sell it and get a random card with 3x8 pin, like a Suprim X, ASUS Strix, or EVGA FTW3. Even a lesser-known card like a Palit GameRock, Gainward Phoenix GS or Galax would do just fine for me, but they are completely impossible to get here at the moment.

I mean, in CP2077 it does fine: 1080p, everything maxed, RT Psycho, DLSS Quality, it runs at 330-340W without really throttling too much and can sustain 2070-2085 MHz (2050-2070 effective) just fine. In, for example, Division 2, all maxed, DX12, 1080p, it's quite a different story. That hammers the card way harder, and it runs 1980-2010 MHz at best, slamming the power limit all the way down to 0.975 V, and at that low a voltage it just can't run any higher clocks. I can raise the offset for CP2077 as high as +150 for 2130 MHz @ 1.081 V, but it can't handle the 2040 it then wants to run in TD2 at 0.975 V.


----------



## derm

Does anyone know what the T6 screws that go into the leaf spring are? One of mine is almost stripped and I'd like to replace them, but I have no idea what screws to buy, or if they are even sold anywhere.


----------



## leegoocrap

derm said:


> Does anyone know what the T6 screws that go into the leaf spring are? One of mine is almost stripped and I'd like to replace them, but I have no idea what screws to buy, or if they are even sold anywhere.


Usually either M2 or M2.5 screws. Search eBay or AliExpress for something like "M2 spring screw" or "M2 GPU screw"
(you'll need to figure out what size they are, of course).


----------



## Falkentyne

derm said:


> Does anyone know what the T6 screws that go into the leaf spring are? One of mine is almost stripped and I'd like to replace them, but I have no idea what screws to buy, or if they are even sold anywhere.


For what card?


----------



## cstkl1

Why do people even like this type of game? "The Medium", maxed out @ 1440p on an RTX 3080.

Gallery URL


----------



## mouacyk

cstkl1 said:


> why do ppl even like this type of game " The Medium" Maxed out @1440p RTX 3080
> 
> Gallery URL
> imgbox - fast, simple image host


Good question. I'm having a difficult time immersing myself enough to get going in Senua. It took multiple tries to finish Ethan Carter, and it turned out to be like a mini storybook version of Zork or Myst. It's the same question as why some people like screamers like The Blair Witch Project. I just don't get some people's obsession with psychological horror thrillers... I'm primarily after the graphics and the art experience.


----------



## derm

Falkentyne said:


> For what card?


For the 3080 FE (I think it also applies to the 3090 FE and 3070 FE). They aren't traditional spring screws in that they don't have any springs on them, but I still can't find them anywhere. I want to take apart my card again, but the thought of stripping a critical screw that cannot be replaced scares me...


----------



## Falkentyne

derm said:


> For the 3080 FE(I think also applies to 3090FE, 3070FE). They aren't traditional spring screws in that they don't have any springs on them, but I still can't find them anywhere. I want to take apart my card again but the thought of stripping a critical screw that cannot be replaced scares me...


The heads seem to be larger than normal, but will Torx M2.5 x 6 screws work? (idk if x8 or x6)






M2.5x6mm Wafer, Pan, Flat Head Metric Machine Screws (www.laptopscrews.com)





What you can also do is take a picture of the original screw and then send them a link to the picture (upload it on imgur or someplace).

Also how do you know it's "almost" stripped?
Torx heads are really difficult to strip because there are so many grooves.
You're far more likely to strip the screwdriver bit, rather than the torx head itself.


----------



## derm

Falkentyne said:


> The heads seem to be larger than normal, but will Torx m2.5 x 6 screws work? (idk if x8 or x6)
> 
> 
> 
> 
> 
> 
> M2.5x6mm Wafer, Pan, Flat Head Metric Machine Screws (www.laptopscrews.com)
> 
> 
> 
> 
> 
> What you can also do is take a picture of the original screw and then send them a link to the picture (upload it on imgur or someplace).
> 
> Also how do you know it's "almost" stripped?
> Torx heads are really difficult to strip because there are so many grooves.
> You're far more likely to strip the screwdriver bit, rather than the torx head itself.


First off, thanks for the info. I'll check out those links and see if I can find a replacement.

As for the stripping, you may be right. I didn't notice my screwdriver actually slipping, but the socket looked very worn down. I guess it's possible I'm being overly cautious, but it certainly didn't look good.


----------



## Imprezzion

So, I found out that I can make a fully custom curve by adjusting all the points separately. It isn't 100% perfect, but it works.

The problem I was having: with, for example, +120 on the core it would be stable at 1995-2010 MHz at 0.987 V, but it couldn't run 2100 at 1.056 V on the other end of the scale. And voltage and clocks jump around way too much at higher clocks/voltages due to power limits, so it wasn't happy up there.

What I basically did now is make the curve +120 from 0.950 V to 1.000 V, +105 from 1.006 to 1.025, +90 from 1.031 to 1.050, +75 from 1.056 to 1.062, and +60 above that. This creates a weird-looking curve, but it seems to work just fine. It boosts to the exact points I want it to at all times and seems to be stable under any load I give it now.
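For reference, the stepped curve described above is basically a lookup table. A hypothetical sketch (the breakpoints and offsets are the ones from this post; in Afterburner you set these per V/F point by hand, this just shows the shape):

```python
# Sketch of the stepped offset curve described above.
# Breakpoints/offsets are the values from the post, not a recommendation.

def offset_for_voltage(v: float) -> int:
    """Return the core clock offset (MHz) applied at a V/F point (volts)."""
    if v <= 1.000:
        return 120   # 0.950-1.000 V segment
    if v <= 1.025:
        return 105   # 1.006-1.025 V segment
    if v <= 1.050:
        return 90    # 1.031-1.050 V segment
    if v <= 1.062:
        return 75    # 1.056-1.062 V segment
    return 60        # everything above

print(offset_for_voltage(0.987))  # -> 120
print(offset_for_voltage(1.081))  # -> 60
```

The point of the descending offsets is that the card holds a bigger overclock at the low-voltage points it actually sustains under load, without demanding unstable clocks at the top of the curve.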










This is also visible in the effective clocks, which are generally about 40 MHz below what I set. So 1995 MHz is ~1956 effective, and 2040 MHz is ~2007 effective, where before, with a flat +120 across the curve, it would go as high as 2070 effective, which it just can't handle at the given voltages.


----------



## mikalcarbine

I have an Aorus 3080 Master Rev 1. I recently replaced the crap thermal pads and added 3mm ones to the backplate, which lowered my memory junction temps from constantly throttling at 110C to 84C while mining; during games it's even lower, in the 70s. While mining I have no problem reaching +1500 on my memory now, as long as I keep it cool. Does anyone know how I can extend the memory clock limit past +1500 in Afterburner? Aorus Engine has a higher clock limit but doesn't give any control over the V/F curve. I've locked my core at 0.7 V to minimize power consumption while mining.


----------



## geriatricpollywog

Clukos said:


> Finally managed to get an FE at MSRP and put the EK waterblock on it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 18 605 in Time Spy
> 
> 
> AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> Almost 20K gpu score, so close
> 
> View attachment 2478917
> 
> 
> Thermals are looking nice with undervolting, this is after 40 mins of cyberpunk with RT at 4K
> 
> View attachment 2478918
> 
> 
> Before putting the block the memory would hit 110C and throttle, now it stays at a cool 60-65C


A 1425 MHz average memory frequency is outstanding.


----------



## Colonel_Klinck

mikalcarbine said:


> I have an Aorus 3080 Master Rev 1, I recently replaced the crap thermal pads and added 3mm ones to the backplate which lowered my mem junction temps from constantly throttling at 110C to 84C while mining, during games its even lower in the 70s. While mining I have no problems reaching +1500 on my memory now as long as I keep it cool. Does anyone know how I can extend the memory clock limit past 1500 in afterburner? Aorus Engine has a higher clock limit but does not give any control to the v/f curve. I've locked my core at 0.7v to minimize power consumption while mining.


Yeah, it is annoying that AB won't let you push past +1500 on mem. It looks like EVGA Precision X1 allows you to push well past +1500 on mem, but it has no custom curve like AB.


----------



## Imprezzion

Colonel_Klinck said:


> Yeah it is annoying that AB won't let you push past +1500 on mem. It looks like EVGA Precision X1 allows you to push well past 1500 on mem but no custom curve like AB


I thought Precision does have curve?


----------



## Professor McNasty

I was finally able to get my hands on an RTX 3080 FTW3 from EVGA, but I'm getting crashes when playing games like GTA V that my 1070 never had. Has anyone else run into this?

Core i9 - 10850k
32GB DDR4 3600MHz CL16 RAM
Gigabyte AORUS Ultra Mobo
Cooler Master V1000 80PLUS Gold PSU


----------



## Hirtle

Colonel_Klinck said:


> Yeah it is annoying that AB won't let you push past +1500 on mem. It looks like EVGA Precision X1 allows you to push well past 1500 on mem but no custom curve like AB


Precision X1 does indeed have a V/F curve.


----------



## ducegt

Professor McNasty said:


> I was able to get my hands on an RTX 3080 FTW 3 from EVGA finally but I'm getting crashes when playing games like GTA V that my 1070 never had. Has anyone else run into this?


Are you using one cable to power two slots by chance?


----------



## MrKenzie

Clukos said:


> It hovers around 2.1GHz but can boost all the way up to 2.2GHz, the memory on this card scales all the way up to +1900 without any errors, I assume the water is helping because on air I couldn't go above +1500. For that run I was at +200 on the core and +1900 on mem.


Your card is definitely a rare beast! Both the GPU and the memory quality are above average! My Time Spy graphics score was slightly lower even though I have a 450W BIOS and the card is running at 20c. My memory is crap; it crashes above +650. Otherwise I estimate I could achieve over 20,500 graphics score.


----------



## EarlZ

mikalcarbine said:


> I have an Aorus 3080 Master Rev 1, I recently replaced the crap thermal pads and added 3mm ones to the backplate which lowered my mem junction temps from constantly throttling at 110C to 84C while mining, during games its even lower in the 70s. While mining I have no problems reaching +1500 on my memory now as long as I keep it cool. Does anyone know how I can extend the memory clock limit past 1500 in afterburner? Aorus Engine has a higher clock limit but does not give any control to the v/f curve. I've locked my core at 0.7v to minimize power consumption while mining.


Just wanted to get more information on this: 3mm thermal pads on the back of the VRAM, and what thickness did you use on the VRAM itself... 2mm?
Which brand of thermal pads did you use?


----------



## lordzed83

OK, Alphacool stock TIM and stock pads vs. Gelid 15 W/mK pads + der8auer's LM.









With no messing about, mining temps:









24c down on the memory, 5c down on the core.









Today a new version of HWiNFO landed, adding hotspot temperature. So these are my temps after 12 hours of GPU+CPU mining, running some Port Royal and playing some WoW.


All in all, if you're going water, I recommend dropping extra on pads and LM. It makes a good difference.


----------



## lordzed83

And a link to the pads:








Gelid Solutions GP-Ultimate Thermal Pad 90x50x1.0mm, 15 W/mK thermal conductivity (www.amazon.co.uk)


----------



## BluePaint

lordzed83 said:


> Ok Alphacooling Stock TIM and Stock pads VS Gelid 15w + Der8ouers LM


Great to know! I just wanted to ask about pad thickness since I finally received the Alphacool block yesterday.

On the EK waterblock for my 1080 Ti (which I only installed a few days ago after 2 years of having them in the basement, lol), the provided VRAM pads were 0.5 mm and seemed not thick enough. So I installed 1 mm pads, and since GPU temps are great (35C max), I guess 1 mm for the VRAM wasn't too thick (otherwise they probably would have impacted contact with the GPU core).

It seems that it's always good to check whether provided pads are actually thick enough.


----------



## lordzed83

BluePaint said:


> Great to know! I just wanted to ask about pad thickness since I finally received the Alphacool block yesterday.
> 
> On the EK waterblock for my 1080Ti (which I only installed a few days ago after 2 years having them in the basement, lol), the provided VRAM pads were 0.5 mm and seemed not thick enough. So I installed 1 mm pads and since GPU temps are great (35C max) I guess 1 mm for VRAM wasn't too thick.
> 
> It seems that it's always good to check whether provided pads are actually thick enough.


The 1mm ones I listed go on the memory. That pack can do two 3080s; it's spot on if you cut sideways, just one cut per pad.


There is one ******** spot in the kit where you stack 2 pads on top of each other, like a 5mm square. It looked dodgy AF, so I added a bit of TIM just in case.


----------



## mouacyk

@lordzed83 Thanks for sharing -- glad to see that moving to more efficient pads does improve memory cooling further. May have something to do with that thick backplate, which the Bykski block also comes with. Got some Fujipoly 17 W/mK pads here, but still waiting until I have the time and courage to do the power mod, so I don't have to disassemble multiple times.


----------



## lordzed83

mouacyk said:


> @lordzed83 Thanks for sharing -- glad to see that moving to more efficient pads does increase memory cooling more. May have something to do with that thick backplate, which the Bykski block also comes with. Got some Fujipoly 17w pads here, but still waiting for when I have time and courage to power mod, so don't have to disassemble multiple times.


That's why I decided to test and report, because I'd asked a few YouTubers and people on forums, and even asked Aquacooling about the pads.









Hmm, I'd say on the GPU yes, but not on the memory. A mate got the Alphacool reference block and got around the same temps as me; another mate got Bykski blocks and around the same temps too. Yeah, definitely, investing extra in pads is worth it on Ampere.


----------



## obscurehifi

mikalcarbine said:


> I have an Aorus 3080 Master Rev 1, I recently replaced the crap thermal pads and added 3mm ones to the backplate which lowered my mem junction temps from constantly throttling at 110C to 84C while mining, during games its even lower in the 70s. While mining I have no problems reaching +1500 on my memory now as long as I keep it cool. Does anyone know how I can extend the memory clock limit past 1500 in afterburner? Aorus Engine has a higher clock limit but does not give any control to the v/f curve. I've locked my core at 0.7v to minimize power consumption while mining.


Yes, Aorus Engine lets you go past +1500, but there's a DDR conversion to factor in: +3000 in AE equals the same memory clock as +1500 in AB.

Make sure you run the memory error tool that's been mentioned in this forum when going high. Errors get corrected and don't show up as an apparent issue, but they slow things down. Also, higher mem clocks will rob from the overall power limit, so it's a balancing act.
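The AE-vs-AB conversion is just a factor of two (per the +3000 AE == +1500 AB equivalence above). A quick sanity check, with hypothetical helper names:

```python
# Aorus Engine (AE) expresses memory offsets as double the MSI Afterburner
# (AB) number, per the conversion described above (+3000 AE == +1500 AB).

def ab_to_ae(ab_offset: int) -> int:
    """Convert an MSI Afterburner memory offset to its Aorus Engine value."""
    return ab_offset * 2

def ae_to_ab(ae_offset: int) -> int:
    """Convert an Aorus Engine memory offset to its Afterburner value."""
    return ae_offset // 2

print(ab_to_ae(1500))  # -> 3000
print(ae_to_ab(3000))  # -> 1500
```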

Sent from my SM-G973U using Tapatalk


----------



## Imprezzion

That tool doesn't work for me at all. It told me there were no errors at +1400, or +1500 for that matter, but scores in benchmarks drop hard at +1400, and +1500 straight up crashes..

I'm at +1250, and that is actually solid, with no errors in the longest test and still-scaling scores in benches / FPS in games.

Still kinda disappointed in the OC potential of my card. It hits a hard wall at 2025 MHz. It's actually quite weird that it doesn't respond to voltage at all: 1995-2010 is stable as a rock at 0.987 V, 2025 MHz needs like 1.037 V to be stable, 2040 needs 1.062, 2055 needs 1.081 V, and anything above just randomly crashes in games after a few minutes.

Card can bench and run games for short periods as high as 2145Mhz but it crashes rather quickly.

I used CP2077 and World of Tanks (capped at 144 FPS) as tests, as those are the only games that don't hit the power limit with DLSS enabled / FPS capped. More DLSS in CP2077 means much lower power draw, so I used 1080p, all max settings, RT Psycho, DLSS Balanced to keep it from hitting power limits up to the full 1.100 V, but it's super unstable above 2055 MHz. If I force it to 2010 @ 0.987 with a curve, it just runs for hours fine. I really expected to get at least 2100 @ 1.081 V or so, judging by how easily it runs 2010.

I doubt it's temp related either. It has the stock Gigabyte Gaming OC air cooler with Coollaboratory Liquid Ultra on it, and it doesn't even touch 60c in games: 57-59c @ ~75% fan speed. I also have the Arctic Accelero IV backplate on it with thermal pads, which definitely works judging by how unbelievably hot the backplate gets, so it's drawing a LOT of heat away from the card. Junction temps are still low-to-mid 80s though.


----------



## atand2

I'm having an issue where my 3080FE fans aren't syncing. I set them both to 100% and noticed fan 1 will reach 3800 RPM with fan 2 only maxing out at 3400RPM. Even at 50% there is about 200 RPM difference between the two. Is this normal?

EDIT: I just found a review indicating that these max speeds are correct, but nothing that explains the discrepancy in RPMs when the fans are set to a lower, say 50%, speed.

EDIT #2: I just realized I'm an idiot and that if the max RPM of FAN 2 is lower than FAN 1, then 50% of FAN 2 will clearly be lower than 50% of FAN 1. The takeaway here is that, since these two fans run at different max speeds, having a custom curve that applies the same % pwm to both fans will result in different fan speeds, so you need to find a way to set a fan curve for each individual fan if you want your fan speeds synced. I believe whatever firmware Nvidia uses for the default fan curve already accounts for this and will run FAN 2 at a higher % in order to match speeds with FAN 1, but if you run a custom curve your fans probably will run at different speeds.
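The takeaway in EDIT #2 is easy to check with a little arithmetic, assuming fan speed scales roughly linearly with the duty cycle (real fans only approximate this):

```python
# Two fans with different maximum speeds, driven by the same duty cycle,
# will always run at different RPMs. Max RPMs are the ones reported above.

def fan_rpm(duty_pct: float, max_rpm: int) -> int:
    """Approximate RPM assuming a linear response to the duty cycle."""
    return round(max_rpm * duty_pct / 100)

print(fan_rpm(50, 3800))  # fan 1 -> 1900
print(fan_rpm(50, 3400))  # fan 2 -> 1700, a 200 RPM gap at the same 50%
```

So to sync the two fans in RPM terms, a custom curve would have to drive fan 2 at a higher percentage than fan 1 at every point, which is presumably what the stock firmware curve does.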


----------



## Mr Ripper

Imprezzion said:


> That tool doesn't work for me at all. Told me there's no errors at +1400 or +1500 for that matter but scores in benchmarks drop hard at 1400 and 1500 straight up crashes..


Did you do the NVIDIA Profile Inspector change setting "CUDA - Force P2 State" to OFF, as per the instructions? Otherwise it won't show errors correctly. Also, I find the "1initial" test fairly pointless, other than for quickly finding where you get errors and then working back from that. You need at least the duration of "2check" to get the memory up to high temps to reproduce the kind of problems you get in games / benchmarks.


----------



## lordzed83

Mr Ripper said:


> Did you do the NVidia profile inspector change to "CUDA - Force P2 State" to OFF as per the instructions? Otherwise it won't show errors correctly. Also I find the 1initial test to be fairly pointless, other than quickly finding where you get errors, to then work back from that. You need at least the duration of "2check" to get memory up to high temps to get the kind of problems you get in games / benchmarks.


Aka, as I've put in the instructions on how to use it.

Dropping 24c on the VRAM gave me an extra +150 MHz, so now I'm running +1350. It can do +1400, but errors start to come up, and because of that those 50 MHz are very marginal gains.



















1350 vs 1400


----------



## Imprezzion

Mr Ripper said:


> Did you do the NVidia profile inspector change to "CUDA - Force P2 State" to OFF as per the instructions? Otherwise it won't show errors correctly. Also I find the 1initial test to be fairly pointless, other than quickly finding where you get errors, to then work back from that. You need at least the duration of "2check" to get memory up to high temps to get the kind of problems you get in games / benchmarks.


Yup, did both. Memory gets up to 86c in both testing and gaming. The 12-minute test showed no errors, but 3DMark lost 1000 points and was a stuttery mess, and games crashed almost immediately at +1400 even though it showed no errors in the tool. Weird.


----------



## lordzed83

Imprezzion said:


> Yup did both. Memory gets up to 86c in both testing and gaming. The 12 minute test showed no errors but 3DMark lost 1000 points and was a stuttery mess and games crashed almost immediatly on +1400 even tho it showed no errors in the tool. Weird.


I think there must be something with the memory controller itself on some 3080s, as in my screenshot I start to get errors at +1400 at even lower memory temperatures. Still, some gains in benchmarks all over.

Also remember the tool is tuned for 24 threads, so it depends on what CPU you're running.


----------



## Krisztias

lordzed83 said:


> Ones i listed 1mm ones go on memory that pack can do 2x3080 its spot on to cut sidewise just 1 cut/pad
> 
> 
> there is 1 ****** pad in the kit for. Wher You stack 2 pads on eachother like 5mm square thing looked dodgy AF so added bit tim just in case.


Where did you put the TIM exactly?


----------



## lordzed83

Krisztias said:


> Where did you put the TIM exactly?


On top of the stacked pads, just a little extra.


----------



## Imprezzion

Ok so, I'm starting to understand how the clocks and curve work on these cards. The real-time clock speed MSI AB / HWiNFO64 reports means nothing. Literally nothing. The effective clock in HWiNFO64 is the only thing that actually matters.

I can have 2070 MHz give the same effective clock as 1995 MHz, depending entirely on power target, temperature, and voltage.

So I decided to just ignore the clock speed MSI AB reports and focus on the effective clock. It seems my limit for effective clock is around 1950 MHz at 0.956-0.962 V with +1250 VRAM for the card not to power throttle in heavy games. This shows as a clock speed of 2010-2040 in MSI AB, so according to MSI AB I'm running 2010-2040 MHz @ 0.950-0.962 V. Which I obviously am not; effective is 1945-1955 MHz. I can't really get much higher effective clocks under 1.000 V, but the card also can't run at more than 0.962 V without power throttling like mad.

It's also funny to see the clocks in MSI AB jumping all over the place frantically while the effective clock just sits calmly at 1945-1950 the entire time, not reflecting the AB clocks at all.
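As I understand it (my own rough model, not HWiNFO's documented method), the effective clock is a time-weighted average over the polling window, so brief clock-gated slices pull it below the requested clock:

```python
# Rough illustration: a GPU that requests 2010 MHz but clock-gates for a
# small fraction of each polling window averages out to a lower "effective"
# clock. The 2% gating fraction here is made up for illustration.

def effective_clock(samples_mhz: list[float]) -> float:
    """Average clock over one polling window, counting gated (0 MHz) slices."""
    return sum(samples_mhz) / len(samples_mhz)

window = [2010.0] * 98 + [0.0] * 2   # 2% of the window spent gated
print(round(effective_clock(window), 1))  # -> 1969.8
```

That would explain why the instantaneous reading jumps around while the effective clock stays flat: the requested clock changes per V/F point, but the average over a whole window barely moves.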


----------



## Professor McNasty

ducegt said:


> Are you using one cable to power two slots by chance?


Absolutely not. I have 3 separate 8-pin cables from the PSU to the card.


----------



## Micko

Imprezzion said:


> Ok so, I'm starting to understand how the clocks and curve work on these cards. The real-time clockspeed MSI AB / HWINFO64 tell you means nothing. Literally nothing. Effective clock in HWINFO64 is the only thing that is actually important.
> 
> I can have 2070Mhz give the same effective clock as 1995Mhz just completely depending on power target and temperature and voltage.
> 
> So I decided to just ignore the clockspeed MSI AB reports and just focus on Effective clock. It seems my limit for effective clock is around 1950Mhz at 0.956-0.962v with +1250 VRAM for the card to not power throttle in heavy games. This is a visible clockspeed of 2010-2040 in MSI AB so according to MSI AB I'm running 2010-2040Mhz @ 0.950-0.962v. Which I obviously am not. Effective is 1945-1955Mhz. I cannot really get much higher effective clocks under 1.000v but yeah the card also can't run more then 0.962v without power throttling like mad.
> 
> It's also funny to see clocks in MSI AB jumping all over the place frantically but Effective Clock is just super calm sitting at 1945-1950 the entire time not reflecting the clocks in AB at all.



I didn't manage to google anything regarding GPU effective clock except one YouTube video where a guy explains a way to compensate for the loss of effective clock by setting up the overclock in Afterburner locked to a single voltage/frequency point.






And it kind of works. When undervolting with a usual curve and testing in Heaven, I have 1920 MHz on the core but around 1885-1890 effective. By locking the frequency/voltage, the effective clock was around 1905-1910 during the run, and the final score was indeed slightly better: 211.5 vs 210 fps. The downside is that if you overclock this way, the card will be locked to the selected frequency and voltage non-stop; no downclocking is possible.


----------



## Imprezzion

Yeah, that's why I didn't use that. I want my fanless idle lol. Even my CPU still has EIST enabled (no C-states though).

I got it to the point with a custom curve where it will sustain 1940-1960Mhz effective clock stable with anywhere from 0.943 to 0.962v under even the heaviest loads without power throttling. It's allowed to go to ~2030 effective at >1.062v but it hardly ever reaches that except in games with DLSS as that somehow takes a whole bunch of load off. CP2077 with DLSS Balanced will run 1.081v most of the time @ ~2040 effective. Any more and it gets unstable and crashes the driver.


----------



## lordzed83

I'll leave it here if anybody is thinking about Bykski products: got my money back on a faulty block. It works as a night lamp (not joking).


----------



## mikalcarbine

EarlZ said:


> Just wanted to get more information on this, 3mm thermal pads at the back of the VRAM and what thickness did you use on the VRAM it self.. 2mm ?
> Which brand of thermal pads did you use ?


Yes, 3mm on the back and 2mm on the VRAM. The quality of the pads gigashyte uses is horrible, so I went and replaced everything and barely had enough. I used the Gelid pads you can get on Amazon, but realized afterwards that they are horribly overpriced. I'd recommend you pick up some 120x120 Thermalright sheets here: US $6.84 5% OFF|Thermalright Thermal pad 120X120mm 12.8 W/mK 0.5mm 1.0mm 1.5mm 2.0mm High Efficient thermal conductivity Original authentic| | - AliExpress

Here's a photo of the front side and the sizes I'd recommend. I originally went 2mm for the caps, but it seems like too much, and 1mm for the first row, which wasn't enough.










If you want more info on the backside I can pull that up too



obscurehifi said:


> Yes, Auros Engine let's you go past 1500 but there's a DDR conversion to factor in. +3000 in AE equals the same memory clock as +1500 in AB.
> 
> Mark sure you run the memory error tool that's been mentioned in this forum when going high. Errors get corrected and don't really show an issue that's apparent, but it slows things down. Also, higher mem clocks will rob from the overall power limit, so it's a balancing act.
> 
> Sent from my SM-G973U using Tapatalk


I tried the Aorus tool, and it does go above +1500, but it doesn't let me set the voltage/frequency curve, resulting in a core voltage of 0.8 or higher, whereas I could run AB locked at 0.7. Hopefully in one of the next betas they'll raise the limit like they've done in the past.


----------



## mouacyk

^^ said gigashyte lol. IMO, their Eagle/Gaming/Vision cards cooled pretty well on air.



lordzed83 said:


> Ill leave it here if anybody thinking about BykSki products  money back faulty block works as night lamp (not joking )


Would be helpful if the guy shared what kind of liquid he used, as that may have quickened the peeling. Even on EK blocks, I've had nickel plating come off with just distilled water and biocide. In any case, a good block should not peel like that unless there are particles in the loop scratching it away over time.


----------



## lordzed83

@mikalcarbine Well, the Amazon ones are 15 W/mK and the AliExpress ones are 12 W/mK; gotta pay more for the top stuff.


----------



## lordzed83

mouacyk said:


> ^^ said gigashyte lol. IMO, their Eagle/Gaming/Vision cards cooled pretty good on air.
> 
> 
> Would be helpful if guy shared what kind of liquid he used, as that may have quickened the peeling. Even on EK blocks, I've had nickel plating come off with just distilled water and biocide. In any case, a good block should not peel like that unless there are particles in the loop scratching it away over time.


Yup, distilled water and EK anti-corrosion additive.


----------



## mikalcarbine

lordzed83 said:


> @mikalcarbine well Amazon ones are 15w Aliexpres are 12w gotta pay more for top stuff


The 15 W/mK Ultimate pads only come in 0.5 and 1mm thickness, so their application is very limited; they won't work on cards that require 2mm for the important stuff like VRAM, or 3mm for the backplates. 120x120 sheets of Thermalright or Gelid Extreme make the most sense from a cost and application perspective (sizes available for everything). The prices for the 80x40 sheets on Amazon US are pretty high compared to AliExpress 120x120 sheets, which offer over 4 times more pad at a similar cost, though they are harder to come by and the shipping time might be a turnoff for some.


----------



## ducegt

I haven't given it much thought, but I don't understand the importance of the "effective clocks." The curve in MSI Afterburner is dependent on temperature. Open a heavy application like FurMark that runs in a window and watch what happens. I have my Trio flashed with the Suprim BIOS, starting the Time Spy stress test (from a cold boot) at 2190 MHz and occasionally getting as low as 2040 MHz. Likewise, Horizon Zero Dawn will hit 2190 in the menu and as high as 2175 in-game, but it'll drop 30 MHz once it gets heat soaked.

My only real concern is that my 850W PSU at the top of my case is getting super hot with the power limit maxed out, even in HZD, where it usually pulls about 340-380W according to Afterburner. I'll be adding a 92mm fan in my case below the PSU to help it out, and can later post a picture of my inverted ATX configuration. I've heard many times that the Trio is garbage because of its lesser VRM, but it seems the GPU lottery is of far more importance.


----------



## Peter Watson

mikalcarbine said:


> The 15w ultimate only come in 0.5 and 1mm thickness so their application is very limited and won't work on cards that require 2mm for the important stuff like VRAM or 3mm for the backplates. 120x120 sheets of Thermalright or Gelid Extreme make most sense from a cost and application perspective (sizes available for everything). The prices for the 80x40 sheets on Amazon US are pretty high compared to Aliexpress 120x120 sheets which offer over 4 times more pad at a similar cost. They are very hard to come by and the shipping time might be a turnoff for some.


I've just received the water block for my 3080. I plan on using thermal putty to fill the larger gaps. I'm hoping to hit 2300 MHz.


----------



## Falkentyne

mikalcarbine said:


> Yes 3mm on the back and 2mm on the VRAM. The quality of the pads gigashyte uses is horrible so I went and replaced everything and barely had enough. I used the gelid pads you can get on Amazon but realized after they are horribly overpriced. I'd recommend you pick up some 120x120 Thermalright sheets here https://m.aliexpress.com/item/4000134361013.html?trace=wwwdetail2mobilesitedetail
> 
> Here's a photo of the front side and sizes I'd recommend. I originally went 2mm for the caps but it seems like too much and 1mm for the first row which wasn't enough.
> 
> View attachment 2479478
> 
> 
> If you want more info on the backside I can pull that up too
> 
> 
> 
> I tried the Aorus tool and it does go above 1500 but doesn't not let me set the voltage/frequency curve resulting in a core voltage of 0.8 or higher whereas I could run AB locked at 0.7. Hopefully in one of the next betas they'll increase this like they've done in the past.


You linked the mobile site btw. It looks TERRIBLE on PC.


----------



## mikalcarbine

Falkentyne said:


> You linked the mobile site btw. It looks TERRIBLE on PC.


I had copied that from a text on my phone; the mobile site looked even worse on my phone when I tried to open it.

I fixed the link on that post, here it is again US $6.84 5% OFF|Thermalright Thermal pad 120X120mm 12.8 W/mK 0.5mm 1.0mm 1.5mm 2.0mm High Efficient thermal conductivity Original authentic| | - AliExpress

I just placed an order. The estimated delivery of April 22 worries me. I already did most of my card, so these will be for my fiancé's 3070, or to play around with / optimize. I have 2 NF-A12x25s coming today; going to see if deshrouding gives me any benefit.


----------



## Falkentyne

mikalcarbine said:


> I had copied that from a text on my phone; the mobile site looked even worse when I tried to open it there.
> 
> I fixed the link on that post; here it is again: Thermalright Thermal Pad 120x120mm 12.8 W/mK (0.5/1.0/1.5/2.0mm), US $6.84 (5% off) on AliExpress
> 
> I just placed an order; the estimated delivery of April 22 worries me. I already did most of my card, so these will be for my fiancé's 3070 or to play around with/optimize. I have 2 NF-A12x25s coming today; going to see if deshrouding gives me any benefits.


Since everyone in the world seemed to suddenly panic-buy all of the good thermal pads everywhere (except the horribly overpriced 17 W/mK Fujipoly and the $45-$100+ Alphacool 100 x 100mm pads), at this point I would just buy a large collection of 120mm x 120mm pads from China and have them saved up for the future. Like two 0.5mm, two 1mm, two 1.5mm and two 2mm (I don't think the Gelids are available in 2mm though).









15.51US$ | Gelid TP-GP02 120x120mm (0.5/1.0/1.5mm) thermal pad | www.aliexpress.com

8.65US$ (6% off) | Thermalright Thermal Pad 120x120mm 12.8 W/mK (0.5/1.0/1.5/2.0mm) | www.aliexpress.com

7.48US$ (32% off) | Thermalright Odyssey Thermal Pad 12.8 W/mK 120x120mm | www.aliexpress.com

Take your pick.
Then just wait for the slow boat.

Meanwhile, to get you by right now:
Since these are not available (Arctic clones 6 w/mk): https://www.amazon.com/gp/product/B086W11JKM
Try these.
Quickly.






145mm x 145mm x 1.5mm Thermal Pad | www.amazon.com

200mm x 200mm x 1.5mm Thermal Pad | www.amazon.com

(I don't get it. 200mm * 200mm is the same price as 145mm * 145mm. You know what to do.......)

I suggest you get them before those go out of stock also.
Then you can use those while you wait for better pads to get in stock. They won't be worse than the original ones!
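The joke about the pad sizes checks out: for the same price, the bigger sheet is nearly double the area. A quick Python sketch (the price here is hypothetical, since the point is only that both listings cost the same):

```python
# Compare cost per area for the two Amazon pads linked above.
# Prices are assumed equal (the joke in the post); only the sizes differ.

def cost_per_cm2(price_usd, side_mm):
    """Cost per square centimetre of a square thermal pad."""
    area_cm2 = (side_mm / 10) ** 2
    return price_usd / area_cm2

price = 10.0  # hypothetical price, same for both listings
small = cost_per_cm2(price, 145)   # 145 x 145 mm pad
large = cost_per_cm2(price, 200)   # 200 x 200 mm pad

# The 200 mm sheet gives ~1.9x the area for the same money.
print(round((200 / 145) ** 2, 2))
```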


----------



## BluePaint

I am pretty happy putting the Trio under the Alphacool block. Now my radiator needs some fresh and cool morning air.










Used the same Gelid pads for the VRAM that lordz83 used. Paste is Kryonaut.
I also added 1mm pads on top of the 2mm pads provided by Alphacool on the back of the card, because it seemed weird to me that the back of the GPU core gets 3mm pads while the back of the VRAM only gets 2mm, even though the GPU core's back has taller components sticking out.
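A trivial sanity check of that stacking, assuming (as the post does) that the VRAM-side pad stack should end up matching the 3mm gap Alphacool allows behind the core:

```python
# Rough sanity check for pad stacking behind the card, assuming the
# backplate-to-PCB gap behind the VRAM is about the same as behind the core.
core_gap_mm = 3.0    # Alphacool ships 3 mm pads behind the GPU core
vram_pad_mm = 2.0    # ...but only 2 mm behind the VRAM
extra_pad_mm = 1.0   # stacking a 1 mm pad on top of the 2 mm one

# The stack now matches the core-side gap, so the pads should actually compress.
assert vram_pad_mm + extra_pad_mm == core_gap_mm
```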


Finally over 13000 points in PR (best result before water was 12770 with chilled air):
Benching with water is so much easier due to temps being much more steady. With chilled air, temps could change by 20C from start to end. On water by 1 or 2.
https://www.3dmark.com/3dm/58588790


----------



## ssgwright

BluePaint said:


> I am pretty happy putting the Trio under alphacool block. Now my radiator needs some fresh and cool morning air
> 
> View attachment 2479595
> 
> 
> Used the same Gelid pads for VRAM, lordz83 used. Paste is Kryonaut.
> I also added 1mm pads on top on the 2mm pads provided by alphacool on the back of the card because it seemed weird to me that the back of the GPU core got 3mm pads and the back of the VRAMs only 2mm even though the GPU core back has more electronics sticking out higher.
> 
> 
> Finally over 13000 points in PR (best result before water was 12770 with chilled air):
> Ok, that thing is crazy. Benching since 30 minutes and it's still going up (now up to 2265Mhz)
> https://www.3dmark.com/3dm/58588790
> 
> View attachment 2479603


Nice score!!!


----------



## BluePaint

ssgwright said:


> Nice score!!!


Thanks. Best PR with AMD system so far.
Will be interesting to compare PR with my 7700K. In TS, the 1080Ti had more GPU points with the 7700K than with the 5800X, even though the 5800X has 3800 RAM and the 7700K only 3200.


Update:
Did a TS run on both the 5800X and 7700K systems.
Almost 300 GPU points more in TS GPU with the 7700K than with the 5800X, despite 10 MHz less clock on average (due to higher air temps at midday compared to morning) and despite 3800CL14 RAM on the AMD system vs 3200CL14 on the Intel system.

TS GPU loves Intel
https://www.3dmark.com/3dm/58596259


----------



## Peter Watson

I can't wait to put my water block on today; I will hopefully break 13000 in PR. Going to use liquid metal on the GPU, thermal pads on the memory, and TG-PP-10 putty on everything else. I'm hoping I can finally control temps on this 450W card.


----------



## MrKenzie

Peter Watson said:


> I've just received my water block for my 3080 I plan on using the thermal putty to fill the larger gaps. I'm hoping to hit 2300mhz


Good luck! The maximum I have managed is 2295MHz, but it's unstable for gaming. The effective clocks are harder to keep high; maybe aim to keep effective clocks above 2200MHz.

Memory overclock seems just as important as core overclock; if your memory is a bad overclocker (less than +800), then forget about matching what others are achieving.
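To put the memory offset in perspective, peak bandwidth is just effective transfer rate times bus width. The stock numbers below come from the spec table at the top of the thread; the assumption that a +800 Afterburner offset adds about 1600 MT/s effective is the usual GDDR6X reading, not a measured figure:

```python
# Back-of-envelope GDDR6X bandwidth, using the spec-table numbers from the
# thread header (19000 MT/s effective, 320-bit bus -> 760 GB/s).
def bandwidth_gbs(effective_mts, bus_bits):
    """Peak bandwidth in GB/s: transfers per second times bytes per transfer."""
    return effective_mts * bus_bits / 8 / 1000

stock = bandwidth_gbs(19000, 320)   # 760.0, matching the spec sheet
# Assumption: a "+800" Afterburner offset doubles into effective rate
# (+1600 MT/s); that is the usual convention, not a measurement.
tuned = bandwidth_gbs(19000 + 1600, 320)
print(stock, tuned)  # 760.0 824.0
```

So a good +800 memory chip is worth roughly 8% more raw bandwidth, which is why it moves Port Royal scores about as much as a core bin or two.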


----------



## Peter Watson

MrKenzie said:


> Good luck, maximum I have managed is 2295MHz but it's unstable for gaming. The effective clocks are harder to keep high, maybe aim to keep effective clocks above 2200MHz
> 
> Memory overclock seems just as important as core overclock, if your memory is a bad overclocker (less than +800) then forget about matching what others are achieving.


I'm just starting it now lol. I just want that 13000 Port Royal score... Ohh man, I'm scoring less lol. Have drivers changed or something? I'm quite a bit off my old score from before water cooling. I seem to be hitting the power limit at 360W on the 450W BIOS.









Result | www.3dmark.com


----------



## ducegt

I damn near broke 13k on air with normal ambient temperatures. 12,934 with 430W BIOS. 

My case doesn't have a window so it's function over form all the way. The 140s on front and bottom are intake while the other 3 (PSU included) are exhaust.


----------



## ssgwright

Peter Watson said:


> I'm just starting it now lol. I just want that 13000 Port Royal score... Ohh man, I'm scoring less lol. Have drivers changed or something? I'm quite a bit off my old score from before water cooling. I seem to be hitting the power limit at 360W on the 450W BIOS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Result | www.3dmark.com


Same thing happened to me: went from 2190MHz to over 2250 and my score dropped. No idea what the problem is.


----------



## Peter Watson

ssgwright said:


> Same thing happened to me: went from 2190MHz to over 2250 and my score dropped. No idea what the problem is.


Yeah, really strange. I've not benchmarked the card for ages, but I truly thought I was going to get that 13000 in Port Royal once I could hit 2295 clocks.


----------



## BluePaint

Peter Watson said:


> Yeah, really strange. I've not benchmarked the card for ages, but I truly thought I was going to get that 13000 in Port Royal once I could hit 2295 clocks.


That's weird. Same driver I used, and on a fast Intel, which also gives some points in PR. Something must be off in the system configuration. Maybe the NVIDIA control panel?


----------



## Peter Watson

BluePaint said:


> Thats weird. Same driver I used and on fast Intel which also gives some points in PR. Something must be off in system configuration. Maybe nvidia control panel?


To be honest my system is quite old now; my PSU is over 8 years old and probably on its way out. I checked the molex connectors going to the gfx card, and when I wobbled them, the no. 1 red light came on on the gfx card, so there must be a bad connection. Sometimes I hit the power limit and other times it's fine. It's so strange; I'm thinking it's this iffy modular PSU cable.


----------



## ducegt

PR 13,064









I scored 13 064 in Port Royal
Intel Core i9-9900K Processor, NVIDIA GeForce RTX 3080 x 1, 16384 MB, 64-bit Windows 10
www.3dmark.com


----------



## ssgwright

ducegt said:


> PR 13,064
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 13 064 in Port Royal
> Intel Core i9-9900K Processor, NVIDIA GeForce RTX 3080 x 1, 16384 MB, 64-bit Windows 10
> www.3dmark.com


which 3080 you running?


----------



## BluePaint

ducegt said:


> PR 13,064
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 13 064 in Port Royal
> Intel Core i9-9900K Processor, NVIDIA GeForce RTX 3080 x 1, 16384 MB, 64-bit Windows 10
> www.3dmark.com


That's surely a pretty good chip, running those frequencies at 50C!
Would be worth putting under water.


----------



## Imprezzion

Are the mounting holes on an EVGA FTW3 the same as on all the other 3080s?

I'm thinking of getting an EVGA Hybrid kit for my 3080, which is most definitely not an EVGA card lol. I tried to put my old Kraken G12 on it but it does not have the right mounting holes.

If all 3080s have standard mounting hole spacing I should be able to put that EVGA Hybrid on any card I have and/or get/trade.

EDIT: I found 2 PCB shots of both cards on Google and compared the mounting holes and the general PCB layout around the die; it's basically identical. Different PCB shape and width, but the area around the die, including the memory layout (and partly the VRM as well), is identical between the FTW3 and the Gigabyte Gaming OC. See below.

FTW3 Ultra:









Gigabyte Gaming OC:









So yeah, I'm pretty sure at least the waterblock from an EVGA Hybrid 3080/3090 cooler will fit, and maybe even the memory plate. The VRM will need the copper heatsink + tape treatment, just like my 2080 Ti.

It's going to take 2-3 weeks for the Hybrid to ship here from the US and I have to buy it through eBay as EVGA EU shop does not list the Hybrid cooler and the US shop doesn't ship outside US except through eBay. Will cost me about $150 total.


----------



## Peter Watson

BluePaint said:


> Thats weird. Same driver I used and on fast Intel which also gives some points in PR. Something must be off in system configuration. Maybe nvidia control panel?


I've come to the conclusion that my EVGA RTX 3080 FTW3 has degraded over time due to heat. I did notice the rear of the card, where the caps are, had turned a little brown from heat. I checked my CPU and memory and all seems fine; just my GPU has lower scores. Ohh well.


----------



## EarlZ

I am looking at adding some heatsinks to the back plate of my 3080 after I replace the stock thermal pads and add a 3mm pad behind the GDDR6X (the VRM also comes with a stock thermal pad at the back, which I won't be replacing). I've got some dated 3M double-sided thermal tapes (black) and I have no idea what the heat transfer rating of this thing is. Should I just get Arctic Cooling or Thermalright pads for the back plate? There won't be any pressure to press them down, and I am wondering if that would still work out.


----------



## BluePaint

Managed to squeeze out some more points in Port Royal with the Intel system and some curve tweaking.
About 140 points more with the same curve on [email protected] (3200 RAM) vs [email protected] (3800 RAM).
20C GPU temp with 6C water. 
Update: 
Number 10 on leaderboard


----------



## Imprezzion

Just my luck. I noticed my card getting considerably hotter than normal, so I opened the case and yeah, of course... the back fan is dead. Not going to be able to RMA it easily, as there are no cards available, and mine was bought through Amazon from Italy while I'm in the Netherlands.

So yeah, I did some dumb stuff. Just like the good ol' days I went full freaking YOLO on the poor thing and zip tied some 140mm's to the cooler after removing the entire fan assembly and made a custom plug for the fans meaning I can control them through the PWM and RPM wires of the card but the fans power comes from the motherboard through a splitter and a normal header.

Poor poor thing.. full ghetto.. funny thing is temps dropped over 5c, no noise, hotspot temp dropped 10c, junction dropped 15c..

Sometimes I'm ashamed of myself how little I care about my hardware lol.

I even managed to remove the Gigabyte RGB logo and taped it to the front of the fans lool.

So yeah, this basically means I have to either run it like this or get a Bykski waterblock a little sooner than I anticipated, as I don't actually have the loop done for it yet.


----------



## MrKenzie

BluePaint said:


> Managed to squeeze out some more points on Port Royale with Intel System and some curve tweaking.
> About 140 Points more with same curve on [email protected] (3200ram) vs [email protected] (3800ram).
> 20C GPU temp with 6C water.
> Update:
> Number 10 on leaderboard
> View attachment 2479881


Nice result! Almost 500 points more than my best with the same cpu. I just bench mine with cpu at 5.0 which is my 100% stable clock, and don't change anything in windows, so I think I could manage 13,200 or something but I can't be bothered chasing that last bit. Weren't the LN2 scores with 3080 only 14,000?




----------



## Imprezzion

Here's the results of my unplanned 140mm fan mod lol. Running Division 2 with a +120 / +1250 overclock and letting the power limit decide what it gets: a full 340-345W constant load. It's running an Accelero IV backplate as well, with thermal pads for the VRM and VRAM, and Coollaboratory Liquid Ultra (liquid metal) "paste" on the die. Look at that perfectly flat temp line lol! I'm also nowhere near max RPM for the 140mm's, as they can do 1800 RPM when needed.










Pics of the card:


















Ghetto AF


----------



## BluePaint

Ghetto FTW!
I was using 2 Noctua 140mm 3000RPM industrial fans with manual Noctua fan control on my Trio before water. Great temps for air and also quiet when needed, although the MSI fans were already pretty good.



MrKenzie said:


> Nice result! Almost 500 points more than my best with the same cpu. I only bench mine with cpu at 5.0 which is my 100% stable clock, and don't change anything in windows, so I think I could manage 13,200 or something but I can't be bothered chasing that last bit. The LN2 scores with 3080 were only 14,000 so your score is nuts.


CPU frequency doesn't seem to matter much in PR. Tested 5.2 but it didn't give any more points than 5.1. The last test was at 5.15 because I raised BCLK to 101 MHz. Not sure how much difference (if any) that made, because I also adjusted the GPU curve at the same time.

One thing which seems to have made a difference of 200 points was enabling HAGS (hardware-accelerated GPU scheduling) in Windows. I disabled it temporarily and got 200 points less. In some 3DMark benches it can impact the result negatively I think, but in PR it seems to help.

That score was only possible on my card with water temps < 10C. Wish I had had my water block a week earlier when we had -10C air. GPU temps of 10C would have been possible, like with my freshly installed water block on my 1080Ti, which ran @ 2240 MHz @ 10C with the XOC BIOS and scored the highest 1080Ti PR score so far. Not that PR matters in any way for a 1080Ti at under 13fps, lol.


----------



## ducegt

ssgwright said:


> which 3080 you running?


Gaming X Trio flashed with the newest Suprim BIOS on TPU.



BluePaint said:


> Thats surely a pretty good chip, running those frequencies at 50C!
> Would be worth putting under water


I also think it's worthy of water, but I don't think I'll bother in large part because my 9900K is a total dud. Can't do 5ghz stable at any voltage. The 7700K it replaced could finish TimeSpy (albeit the older version of TS) at 5.4ghz @ 1.5....that probably was also with the help of winter air. Prior to this 3080, my Vega 64 LC couldn't overclock a single mhz on the core until I repasted and managed to grab the #1 spot for 7700k w/ Vega 64 on ambient temperature to boot. I haven't toyed with the GPU scheduler FWIW.

Is it not possible to hard-mod overvolt Ampere? Hard to find any info; the endless Google pages all relate to undervolting and shunt modding. I'd think a 3080 on chilled water would really shine with like 1.25V on the core.
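On the shunt-mod side of that question, the usual trick is stacking a second resistor on each current-sense shunt so the controller under-reads current. A quick sketch of the math, assuming the common ~5 mOhm stock shunts (not verified for any specific 3080 board):

```python
# Why a shunt mod raises the real power limit: stacking a resistor in
# parallel halves the sensed resistance, so the controller reads half
# the real current. Stock shunt value is an assumption, not a measurement.
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

stock_shunt = 0.005                 # 5 mOhm (assumed)
stacked = 0.005                     # solder an identical shunt on top
effective = parallel(stock_shunt, stacked)   # 2.5 mOhm

scale = stock_shunt / effective     # controller under-reads by this factor
print(f"A 370W limit now allows ~{370 * scale:.0f}W real draw")
```

Which is also why shunt-modded cards need serious cooling and a PSU with headroom: the telemetry everywhere (Afterburner, HWiNFO) keeps reporting the halved number.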


----------



## DaftConspiracy

ducegt said:


> Gaming X Trio flashed with the newest Suprim BIOS on TPU.
> 
> 
> 
> I also think it's worthy of water, but I don't think I'll bother in large part because my 9900K is a total dud. Can't do 5ghz stable at any voltage. The 7700K it replaced could finish TimeSpy (albeit the older version of TS) at 5.4ghz @ 1.5....that probably was also with the help of winter air. Prior to this 3080, my Vega 64 LC couldn't overclock a single mhz on the core until I repasted and managed to grab the #1 spot for 7700k w/ Vega 64 on ambient temperature to boot. I haven't toyed with the GPU scheduler FWIW.
> 
> Is it not possible to hardmod over volt Ampere? Hard to find any info with endless google pages related to undervolting and shunt modding. Would think a 3080 on chilled water would really shine with like 1.25v on the core.


der8auer has a video on shunt modding and overvolting a 3090 TUF

Sent from my IN2025 using Tapatalk


----------



## DaftConspiracy

BluePaint said:


> Managed to squeeze out some more points on Port Royale with Intel System and some curve tweaking.
> About 140 Points more with same curve on [email protected] (3200ram) vs [email protected] (3800ram).
> 20C GPU temp with 6C water.
> Update:
> Number 10 on leaderboard
> View attachment 2479881


I really don't understand why Port Royal likes Intel so much

Sent from my IN2025 using Tapatalk


----------



## Imprezzion

Man these cards are a handful with the 2x8 pin cards limiting so hard lol.

If I just use a core clock offset, I noticed the first part of the curve, up to the 1.006V point, can handle as high as +165 fine, but as soon as I play a lighter game that lets it stretch its legs, like World of Tanks, the top part of the curve at 1.031V and up can't get anywhere close to +165. At 1.081V+ the card can barely handle +60, so it seems to hit a frequency/voltage wall at a rather odd point.

I have to put some work into making a fully custom curve, tuning every single voltage point manually, which is going to take forever.

It's so weird when looking at effective clocks as well. It seems to do ~1970-1980MHz effective around 0.987-1.000V fine, but even a small increase to, say, 2030MHz effective is nowhere near stable even at 1.062V. It really looks like the sort of voltage wall that CPUs seem to hit above certain frequencies.

Maybe I should just hard-limit the curve to, say, +165 from 0.800V up to 1.006V, and lock all points above that to the same clock speed, or maybe 1-2 bins above it from 1.050V and up or something.

Real shame; this card has potential bench-wise on the core clock, but it just doesn't wanna run games stable at anything above 2055MHz-ish (effective ~2030), even though it will pass lighter benchmarks that don't hit the power limit at as high as 2145MHz (2115 effective).

It 100% ain't temperature related. The core doesn't exceed 56C, hotspot 65C, junction 82C, so it's not like I'm pushing 70C+ or anything. And even if I do (I set the fans to 500 RPM up to 85C), it will still maintain the same effective clocks without getting unstable.
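The "big offset low in the curve, clamp everything above a voltage point" idea can be sketched in a few lines of Python (the voltage/clock points here are made up for illustration, not read from a real card):

```python
# Sketch of a flattened voltage/frequency curve: apply a large offset on the
# lower voltage points, then clamp every point at/above a chosen voltage to
# one maximum clock. Curve values below are illustrative only.
def flatten_curve(curve, offset_mhz, clamp_v, clamp_mhz):
    out = []
    for volt, clock in curve:
        boosted = clock + offset_mhz
        out.append((volt, min(boosted, clamp_mhz) if volt >= clamp_v else boosted))
    return out

stock = [(0.900, 1830), (0.987, 1905), (1.031, 1950), (1.081, 1995)]
tuned = flatten_curve(stock, 120, 1.031, 2055)
print(tuned)  # points at/above 1.031 V are capped at 2055 MHz
```

In Afterburner terms this is the Ctrl+drag offset plus manually flattening the tail of the curve, which is exactly the shape described a few posts later.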


----------



## MrKenzie

BluePaint said:


> CPU frequency doesn't seem to matter much in PR. Tested 5.2 but didn't give any more points than 5.1. Last test was 5.15 because I raised BCLK to 101 Mhz. Not sure how much difference (if any) that made becaues I also adjusted GPU curve at the same time.
> 
> One thing which seem to have made a difference of 200 points was enabling HAGS (hardware accelerated GPU scheduling) in Windows. I disabled it temporarily and got 200 points less. In some 3dmark benches it can impact the result negatively I think but in PR it seems to help.
> 
> That score was only possible on my card with water temps < 10C. Wish I have had my water block a week earlier when we had -10C air. GPU temps of 10C would have been possible as I did with my freshly installed water block on my 1080Ti which ran @ 2240 Mhz @ 10C with XOC BIOS and scored the highest 1080Ti PR score so far. Not that PR would matter in any way for a 1080Ti with under 13fps, lol.


So you just have cold ambients that allow you to get such low coolant temperatures? I have a chiller hooked up to mine, so I can just set it to 5C and benchmark until I get too much condensation. I can actually go down to 0C, which I might try one day.


----------



## BluePaint

MrKenzie said:


> So you just have cold ambients that allow you to get such low coolant temperatures?


Yes, the whole open setup is on the window sill, and for benching I open the window. Having all the components close to the cold outside air helps with condensation, since winter outside air is usually low in humidity. When I only had distilled water in the loop, there was an occasion where ice started to form in the reservoir, lol. Now I have about 25% Aqua Computer DP Ultra, which is good down to -10C.
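Whether a given coolant temperature condenses comes down to the dew point. A quick Magnus-formula estimate (room conditions assumed for illustration) shows why dry winter air leaves so much headroom:

```python
import math

# Rough dew-point check for sub-ambient water cooling, using the Magnus
# approximation. If coolant drops below the dew point, condensation forms.
def dew_point_c(temp_c, rel_humidity_pct):
    a, b = 17.27, 237.7  # Magnus constants, valid roughly 0-60 C
    gamma = (a * temp_c) / (b + temp_c) + math.log(rel_humidity_pct / 100)
    return (b * gamma) / (a - gamma)

# Assumed dry winter room: 20 C at 30% RH. Dew point lands near 2 C,
# so 6 C coolant stays (just) on the safe side.
print(round(dew_point_c(20, 30), 1))
```

At a humid 60% RH the same room would have a dew point around 12C, which is why chilled-water benching in summer means wrapping the block and tubing.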


----------



## Shoryuken

I have a newly bought MSI 3080 Gaming X Trio. The card can only draw 206 to 225 watts maximum. I've tried every setting possible in Afterburner, flashed the BIOS to Suprim X, and also physically changed 3 different PSUs and used 16AWG cables. The PL in Afterburner can't go over 100%. What I see on powertechgpu is that the 1st 8-pin draws 150W and the other two 8-pins draw 20-25W each. Shouldn't the power be divided among them?
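For reference, here's a rough Python sketch of what a balanced split would look like on a 3x8-pin card versus those readings. All numbers are illustrative, not real telemetry; the slot figure assumes ~60W, under the PCIe slot's 12V ceiling:

```python
# Rough sketch of a healthy power split on a 3x8-pin card vs. the readings
# reported above. Illustrative numbers only, not a real telemetry dump.
def expected_split(total_w, slot_w=60.0, n_connectors=3):
    """Assume ~60W through the PCIe slot, with the remainder shared
    roughly evenly across the 8-pin connectors."""
    per_connector = (total_w - slot_w) / n_connectors
    return [slot_w] + [per_connector] * n_connectors

healthy = expected_split(340)   # roughly [60, 93, 93, 93]
broken = [60, 150, 22, 23]      # roughly what the card above reports
print(sum(broken))              # the broken split caps out near 255W total
```

With two connectors barely contributing, the per-rail limit on the one working 8-pin would explain a hard ~225W ceiling regardless of BIOS or PSU.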


----------



## mouacyk

Shoryuken said:


> I have a newly bought MSI 3080 gaming trio X. The card can only draw 206 to 225 watts maximum. Ive tried every setting possible on afterburner, flashed bios to suprim X and also physically changed 3 different PSUs and used 16AWG cables. The PL on afterburner cant go over 100%. What I see on powertechgpu is that the 1st 8pin draws 150watt and the other 2 8pins draw 20-25 watt each. Shouldnt the power juice be divided among each other?


What PSU and are you using 1 rail per pin, so 3 cables?


----------



## Shoryuken

I have used 2 Corsair 750W units and 1 HP 1200W server PSU. I have used 3 separate cables, 6 to 6+2, so yes, 1 rail per pin.


----------



## Imprezzion

Shoryuken said:


> Ι have used 2 corsair 750w and 1 server psu hp 1200w. I have used 3 separate cables, 6 to 6+2 so yes 1 rail per pin.


What does the newest BETA from HWiNFO64 tell you? Same thing with the power draw? Also, what is the perfcap reason in GPU-Z? PWR, VRel, something else?

I got my curve adjusted to this now: 









Basically +120, with +105 at the later part of the curve, and limited to a max of 2055MHz at anything over 1.031V. If I let it run the full +105/+120 above 1.031V it becomes unstable.
This works great so far. Effective clocks are much closer to actual clocks than before, and the card now runs an average of 1960-1980MHz effective anywhere from 0.950 to 0.987V; in lighter loads it will do 2000-2020MHz effective at 1.000-1.031V.


----------



## Shoryuken




----------



## Imprezzion

Shoryuken said:


> View attachment 2480104
> View attachment 2480105
> View attachment 2480106


So, GPU-Z and HWiNFO64 agree that it's PWR limited.

HWiNFO64 also agrees that only one 8-pin is pulling normal power together with the PCIe slot, and the other two are doing practically nothing above idle. That looks a lot like a defect in the shunts measuring power or something. This does not look like a software issue, as it does show the 56.xx power limit, so it knows the correct percentages and values.

You did do a full DDU driver reinstall just to be sure right?


----------



## Shoryuken

Imprezzion said:


> You did do a full DDU driver reinstall just to be sure right?


Yup.. Another strange thing: GPU-Z says that my memory type is GDDR5 Samsung and the bus width 128-bit.


----------



## acoustic

Card is defective I'd say.


----------



## mouacyk

Update your drivers to the 460 series. Your memory is close to throttling too.


----------



## Shoryuken

Has anyone seen a 3080 card with GDDR5 memory and a 128-bit bus before? This is what GPU-Z claims the card is. The card was bought from a scalper, though.


----------



## acoustic

I think the card is defective. There's no such thing as a 3080 with GDDR5.


----------



## Imprezzion

Memory clock speed is correct for GDDR6X tho.. weird..


----------



## Shoryuken

I have contacted msi


----------



## mouacyk

Anyone buzzed about the ReBAR support coming tomorrow? All you benchers can use another 1% performance.


----------



## BluePaint

mouacyk said:


> Anyone buzzed about the ReBAR support coming tomorrow? All you benchers can use another 1% performance.


NVIDIA will probably use the opportunity of a new BIOS to nerf mining on all Ampere cards, lol


----------



## mouacyk

BluePaint said:


> NVIDIA will probably use the opportunity of new BIOS to nerf mining for all Ampere cards, lol


Works for me. I just need more benchmark points. And unlocked PL, please NVidia.


----------



## BluePaint

Yeah, unlocked PL is something to dream of.
I am currently using my 1080Ti FE under water because it's so much fun with the XOC BIOS and no PL.

Had the opportunity to order a 3080 FE, which should arrive tomorrow. Maybe my first shunt mod, if the core is any good. Too bad there is no unlimited BIOS for the FE like there is for the 1080Ti.


----------



## Imprezzion

Even if the BIOS is unlimited, it will still power throttle if it isn't shunted, right? I did order a bunch of resistors from Mouser but they still haven't shown up...


----------



## mouacyk

Imprezzion said:


> Even if the BIOS is unlimited it will still power throttle if it isn't shunted right? I did order a bunch of resistors from Mouser but they still haven't showed up...


We know it's possible to bypass hardware limits completely with just a BIOS flash on Pascal. So far it seems to be a mixed bag on Ampere, where even the Kingpin 1000W BIOS doesn't seem to give consistent PL unlocks. Who knows, there may be a hard lock on Ampere.


----------



## mouacyk

Lameo -- ReBAR support for other 3000 GPUs not available until late March. FILO is super lame.


----------



## BluePaint

Hmmm, that new FE chip seems to be even slightly better than the one from my MSI card.
With my MSI card, I needed 26C for 2325Mhz. Not sure what to do with it exactly yet, lol.

























Update: can get it briefly to run @ 2370 @ 23C. Lower temps not really possible atm (8C air temps)


----------



## Nizzen

BluePaint said:


> Hmmm, that new FE chip seems to be even slightly better than the one from my MSI card.
> With my MSI card, I needed 26C for 2325Mhz. Not sure what to do with it exactly yet, lol.
> View attachment 2480264


If it can't do 2300mhz in Port royal, -> garbage 😅


----------



## BluePaint

Nizzen said:


> If it can't do 2300mhz in Port royal, -> garbage 😅


Haha, true. My MSI card did 2280 @ 20C in PR, so maybe there is a chance this one could manage it.

I only found the EK block available for the FE, at 270 euro. I waited 2 months for my Alphacool MSI block. Really annoying.


----------



## Rik_IV

Gaming X Trio user here, running +140 on the clock and +1000 on memory, default BIOS. I get around 12241-12400 in Port Royal. Do you guys think it's worth flashing (and voiding the warranty) to the newest Suprim X BIOS for gaming?

According to HWiNFO my card pretty much shows "yes" on "Performance Limit - Power" in every benchmark/mining load, but not in gaming, because of a possible CPU bottleneck at 1080p.


----------



## ducegt

Rik_IV said:


> Gaming X trio user here running +140 clock and +1000 on memory, default BIOS. I get around 12241-12400 on port royal. Do you guys think it's worth flashing( and voiding warranty) to the newest Suprim X BIOS for gaming?
> 
> According to HWiNFO my card has pretty much a "yes" on "Performance Limit - Power" on every benchmark/mining but not in gaming because of possible CPU bottleneck on 1080p.


You can always flash back to restore the warranty, and honestly I don't think they would notice the crossflash. It will give better frames in games for sure, but it's obviously not a giant difference.


----------



## DaftConspiracy

Imprezzion said:


> Man these cards are a handful with the 2x8 pin cards limiting so hard lol.
> 
> If I just use a core clock offset I noticed the first part of the curve up to like the 1.006v part can handle as high as +165 fine but as soon as I play a lighter game that lets it stretch it's legs like World of Tanks the top part of the curve at 1.031v and up can't get nowhere close to +165.. At like 1.081v+ the card can barely handle +60 so it seems to hit a frequency / voltage wall at a rather weird point.
> 
> I have to put some work into making a full custom curve tuning every single voltage point manually which is going to take forever..
> 
> It's so weird when looking at effective clocks as well. It seems to do like ~1970-1980Mhz effective around 0.987-1.000v fine but even a small increase to say, 2030Mhz effective is nowhere near stable even at 1.062v. It really looks like a sort of voltage wall that CPU's seem to have above certain frequencies.
> 
> Maybe I should just hard limit the curve to say, 0.800v up to 1.006v @ +165 and lock all points above that to the same clockspeed or maybe 1-2 bins above it from 1.050v and up or something.
> 
> Real shame, this card has potential bench wise for the core clock but it just doesn't wanna run games stable at anything above 2055Mhz ish (effective ~ 2030) even tho it will pass lighter benchmarks that don't hit power limit as high as 2145Mhz (2115 effective).
> 
> It 100% ain't temperature related. Core doesn't exceed 56c, hotspot 65c, junction 82c so it's not like I'm pushing 70c+ or anything. And even if I do, I set the fans to 500 RPM up to 85c, it will still maintain the same effective clocks without getting unstable.


Really? I found the opposite with mine, it scales excellently with voltage. The higher the voltage the more offset I can run. 

Sent from my IN2025 using Tapatalk


----------



## Imprezzion

DaftConspiracy said:


> Really? I found the opposite with mine, it scales excellently with voltage. The higher the voltage the more offset I can run.
> 
> Sent from my IN2025 using Tapatalk


All logic points to that but I tested it with CP2077 for example. Running 1080P with a high DLSS (balanced or performance) will drop the load low enough to not hit the power limit. Anything above 2055Mhz (~2030Mhz effective) will crash in minutes even on the full 1.100v. I blamed CP2077 for it first but tested it in world of tanks as well with a FPS cap so that it doesn't hit power limit and that too crashes really fast above 2055Mhz.

It does do fine at +120 at 0.987v in both situations (which is 1980-2010MHz), so even the tiny difference of barely 50MHz doesn't respond at all to a full 0.1v more. It's so weird...
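As an aside for readers following along: the "lock everything above one voltage point" idea from the quoted post can be sketched in a few lines. The voltage and clock values below are hypothetical examples; in practice this is done by hand in the MSI Afterburner curve editor, not via a script:

```python
def flatten_curve(points, lock_from_v):
    """Clamp a voltage/frequency curve: every point at or above
    lock_from_v is held at the clock of the first point at that voltage."""
    # points: list of (voltage_V, clock_MHz), sorted by ascending voltage
    locked = []
    lock_clock = None
    for v, clk in points:
        if v >= lock_from_v:
            if lock_clock is None:
                lock_clock = clk  # first point at/above the threshold
            clk = lock_clock
        locked.append((v, clk))
    return locked

# Hypothetical curve: from 1.006 V upward everything gets held at 2010 MHz
curve = [(0.900, 1905), (0.950, 1950), (1.006, 2010), (1.050, 2055), (1.100, 2100)]
print(flatten_curve(curve, 1.006))
```

The result keeps the lower points untouched and flattens the top of the curve, which is exactly the shape that avoids the unstable high-voltage bins.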


----------



## BluePaint

Imprezzion said:


> crashes really fast above 2055Mhz.


Is it an all-POSCAP design or does it have MLCCs?


----------



## Imprezzion

BluePaint said:


> Is it an all-POSCAP design or does it have MLCCs?


All SP (solid polymer) caps, unfortunately. It is a full custom PCB, but with "cheap" components.


----------



## DaftConspiracy

Imprezzion said:


> All SP (solid polymer) caps, unfortunately. It is a full custom PCB, but with "cheap" components.


Make sure you have a good mount; hotspots will cause this more than the caps. Word is the caps only help with stability if the voltage is constantly adjusting (from bouncing off the power limit). I've seen full POSCAP cards run at 2200MHz "stable." This is the curve I run, and it's completely stable throughout: at 1.068v it'll hit 2100MHz, at 1.082v 2115MHz, and at 1.093v 2130MHz.

Sent from my IN2025 using Tapatalk


----------



## Imprezzion

Mount should be fine; I re-did it several times with both normal paste and now Liquid Ultra. It never gets above 58c core / 66c hotspot according to HWiNFO64, and the behavior never changed between any of the mounts.

It's not like it performs badly now. I mean, 1980-2025MHz in MSI AB (1950-1980 effective) is still a healthy OC, and with the voltage staying around 0.987 +/- 1 or 2 bins it is very stable; +1250 memory definitely helps as well.

By the way, can anyone confirm or deny whether the fans of a 3080 eat into the card's power budget? They did on RTX 2xxx. I don't run the fans off the card now, but I didn't do any repeatable tests tbh.


----------



## Rik_IV

ducegt said:


> You can always flash back to restore the warranty, and honestly I don't think they would notice the crossflash. It will give better frames in games for sure, but it's obviously not a giant difference.


Yeah, I guess you're probably right. Were you stable in games with that 13k Port Royal setup? And what +core and +mem did you run? What were the approx gains, like 4-5 fps, or more like 10 fps on average?


----------



## marashz

Imprezzion said:


> By the way, can anyone confirm or deny whether the fans of a 3080 eat into the card's power budget? They did on RTX 2xxx. I don't run the fans off the card now, but I didn't do any repeatable tests tbh.


Someone said, 2-7 pages back, that he saved 40W by removing the fans/LEDs, if I remember correctly. But when I look at my HWiNFO and turn the LEDs on/off and run the fans off/100%, power is always about the same, so I have no idea how the other guy got -40W from that.


----------



## Imprezzion

marashz said:


> Someone said, 2-7 pages back, that he saved 40W by removing the fans/LEDs, if I remember correctly. But when I look at my HWiNFO and turn the LEDs on/off and run the fans off/100%, power is always about the same, so I have no idea how the other guy got -40W from that.


That was me, but I didn't do any repeatable test with the same loads, just speculation and non-repeatable tests on a different BIOS, so I realize now that that test was invalid in so many ways lol..

I now have a spare fan from a 2080 Ti Windforce OC cooler that I bought locally from a guy who water-blocked his card, so I can repair my stock cooler. I might just do a proper repeatable test now.


----------



## EarlZ

I just replaced some of the stock thermal pads on my Aorus Master Rev2 and I am very pleased with the result: a 30c drop in T-junction temps. I also replaced the stock pads on the back side of the VRM and added thermal pads for the GDDR6X area.

I used both 2mm and 3mm pads by Gelid and noticed they are far stiffer than the stock pads. My main concern is that I did not replace the stock pad on the GPU back side for the large black caps, and I can no longer verify whether the stock pads are touching those caps. Since Gigabyte cards do not have an exposed GPU back, do I risk overheating those?


----------



## Imprezzion

EarlZ said:


> I just replaced some of the stock thermal pads on my Aorus Master Rev2 and I am very pleased with the result: a 30c drop in T-junction temps. I also replaced the stock pads on the back side of the VRM and added thermal pads for the GDDR6X area.
> 
> I used both 2mm and 3mm pads by Gelid and noticed they are far stiffer than the stock pads. My main concern is that I did not replace the stock pad on the GPU back side for the large black caps, and I can no longer verify whether the stock pads are touching those caps. Since Gigabyte cards do not have an exposed GPU back, do I risk overheating those?


So, those stock white pads really are that bad?
I have some Arctic blue 3mm pads that I can slap on my 3080 Gaming OC. My core and hotspot are super low now at 58c core / 66c hotspot with Liquid Ultra, but my junction is still 80-86c in gaming, even with a backplate with pads.


----------



## ducegt

Rik_IV said:


> Yeah, I guess you're probably right. Were you stable in games with that 13k Port Royal setup? And what +core and +mem did you run? What were the approx gains, like 4-5 fps, or more like 10 fps on average?


13k is not stable. My gaming-stable setup just ran PR 12,601. It's a custom curve where most of it is +150 (on the Suprim curve) and +195 at 1.1v, mem +799. It really depends on the game. If it's a game that doesn't need the higher PL, which is many games, there may be no gain, but there are some games that pull the full 430w. I would guess it's more like 4-5 fps rather than 10, but really we should be talking about percentages.
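On the "we should be talking about percentages" point, a quick illustration (the fps numbers here are made up for the example):

```python
def pct_gain(base_fps, oc_fps):
    """Express an overclock's fps gain as a percentage of the baseline."""
    return (oc_fps - base_fps) / base_fps * 100

# The same +5 fps is worth very different amounts at different baselines:
print(f"{pct_gain(60, 65):.1f}%")   # a large relative gain at 60 fps
print(f"{pct_gain(140, 145):.1f}%") # a much smaller one at 140 fps
```

Quoting raw fps deltas hides the baseline, which is why percentage gains are the fairer comparison between cards and games.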


----------



## EarlZ

Imprezzion said:


> So, those stock white pads really are that bad?
> I have some Arctic blue 3mm pads that I can slap on my 3080 Gaming OC. My core and hotspot are super low now at 58c core / 66c hotspot with Liquid Ultra, but my junction is still 80-86c in gaming, even with a backplate with pads.


Prior to the pad replacement I was getting 106c in Cyberpunk 2077 and around 100-102c in other games, so a 28-30c drop after the pad replacement. Since you are already getting 80-86, it might not be worth changing them out.


----------



## obscurehifi

marashz said:


> Someone said, 2-7 pages back, that he saved 40W by removing the fans/LEDs, if I remember correctly. But when I look at my HWiNFO and turn the LEDs on/off and run the fans off/100%, power is always about the same, so I have no idea how the other guy got -40W from that.


I've posted my numbers previously, but my LEDs pull about 3w and the difference between my AIO fans on auto and 100% is 8w. My Aorus pretty clearly reserves power for them; it's either 370W one way or 360W the other.

Sent from my SM-G973U using Tapatalk


----------



## Clukos

The FE is really power limited in most benches :/

It looks like shunt mod is the only way to go as no other board shares the 12-pin power input.

I'd probably think about it if these things weren't so damn unavailable. Nvidia ought to let us flash these cards and remove the limits; I don't see why they hard-lock them, considering the number of people who would mod their BIOS and flash it is relatively tiny in the grand scheme of things.


----------



## lordzed83

Clukos said:


> The FE is really power limited in most benches :/
> 
> It looks like shunt mod is the only way to go as no other board shares the 12-pin power input.
> 
> I'd probably think about it if these things weren't so damn unavailable. Nvidia ought to let us flash these cards and remove the limits; I don't see why they hard-lock them, considering the number of people who would mod their BIOS and flash it is relatively tiny in the grand scheme of things.


To reduce RMAs from people frying cards because they've got no clue what they're doing. The voltage limit I do get, but power limits are what I HATE. BTW, AMD did the same thing and blocked theirs too.


----------



## edhutner

Today I received the Alphacool Eisblock water block for my 3080 MSI Suprim X.
I intend to install it tonight, along with liquid metal on the GPU. I know about insulating the GPU die surroundings.
Are there any other things or hidden pitfalls I have to watch out for with this specific block and card?

Thanks in advance


----------



## BluePaint

edhutner said:


> Today I received the Alphacool Eisblock water block for my 3080 MSI Suprim X.
> I intend to install it tonight, along with liquid metal on the GPU. I know about insulating the GPU die surroundings.
> Are there any other things or hidden pitfalls I have to watch out for with this specific block and card?


I just put my Trio (aside from a few more phases, the PCB is identical to the Suprim) under the Eisblock. I used higher-quality pads for the VRAM on the front, and on the back of the VRAM I added another 1mm of pads on top of the supplied 2mm Alphacool pads (because the back of the core gets 3mm Alphacool pads while the VRAM back only gets 2mm, and the components behind the core already stick out higher than the VRAM).

lordzed83 has a comparison of the standard Alphacool pads and the Gelid 15 W/mK ones, which is why I went with better pads as well:
[Official] NVIDIA RTX 3080 Owner's Club


----------



## edhutner

Thank you.
Currently I don't have 3mm pads. I intended to order Gelid Extremes, but they were out of stock.

Do you think it is OK to use the original MSI backplate pads for the memory? They are probably 3mm (they look thicker than the Alphacool ones).

The other option: I have some no-name 1mm and 2mm pads available and can put 1mm over the Alphacools.


----------



## lordzed83

@BluePaint temps stabilised after what, 2 weeks. It's gotta pay for itself like all my previous GPUs did; since it's cold in Notts, it also keeps my room warm, so it saves on heating when not gaming.








With those Mining settings


----------



## lordzed83

edhutner said:


> Thank you.
> Currently I don't have 3mm pads. I intended to order Gelid Extremes, but they were out of stock.
> 
> Do you think it is OK to use the original MSI backplate pads for the memory? They are probably 3mm (they look thicker than the Alphacool ones).
> 
> The other option: I have some no-name 1mm and 2mm pads available and can put 1mm over the Alphacools.


Well, with the block you got pads for everything; it's just that mem temps will be 75c+ on those pads.


----------



## BluePaint

edhutner said:


> The other option: I have some no-name 1mm and 2mm pads available and can put 1mm over the Alphacools.


The back is not so important and can easily be redone, so maybe just use the Alphacool pads with your 1mm no-name on top for the VRAM on the back (that seemed to make sense to me).


----------



## wuttman

lordzed83 said:


> temps stabilised after what 2 weeks.


Nice results, but still a 20C+ delta between the GPU and the water block itself. Copper shims could've brought it even lower. Here's my take on it: Guide on improving gddr6\gddr6x vram thermals with...


----------



## edhutner

Here are my initial results air vs water (alphacool) on 3080 Suprim X.

I added 1mm no-name pads over the original 2mm pads on the backplate. I think it is a little too thick, because the backplate looked kind of convex after I tightened the bolts, but a little hand pressure fixed that minor issue.
The other issue was with the rear bracket: originally it is screwed to both the PCB and the cooler's plastic frame and is rock solid, but with the block there are only the two screws into the PCB, so it's not as solid. Alphacool should have thought about this too.

Other than that I am satisfied.








Most of the numeric values are avg-max; single values are avg. UPS power is total system usage (including the monitor) reported by the CyberPower UPS and probably is not very accurate.
After the block installation, the loop's flow rate decreased by about 65 L/h. Pump is always at 100%.

Overall the performance meets my expectations.

Now I can start raising the power limit higher and play with voltage and clocks.


----------



## lordzed83

@wuttman Well, do you think the same thing could be done using carbon conductive thermal pads?


----------



## Imprezzion

Is there a way to lock a 3080 in the full 3D P-state under load? Some games I play, like World of Tanks, are so light that the card hits 600+ FPS and slams into the power limit super hard. Now, you don't need 600 FPS in WoT, so I enabled V-Sync / G-Sync / whatever it is today for a 144Hz lock (with triple buffering). The card now tends to run at ~1020-1300MHz without even leaving idle voltage, which is nice, but the memory clock is all over the place: it bounces between 5000 and 10750 multiple times a second, creating weird stuttering and jumpy memory temperatures. I just want the memory to stay at max clocks...


----------



## mouacyk

Imprezzion said:


> Is there a way to lock a 3080 in the full 3D P-state under load? Some games I play, like World of Tanks, are so light that the card hits 600+ FPS and slams into the power limit super hard. Now, you don't need 600 FPS in WoT, so I enabled V-Sync / G-Sync / whatever it is today for a 144Hz lock (with triple buffering). The card now tends to run at ~1020-1300MHz without even leaving idle voltage, which is nice, but the memory clock is all over the place: it bounces between 5000 and 10750 multiple times a second, creating weird stuttering and jumpy memory temperatures. I just want the memory to stay at max clocks...


Try CTRL+L in MSI AB. Not sure what the EVGA Precision equivalent is.
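For what it's worth, newer NVIDIA drivers also expose clock locking through nvidia-smi (run with admin rights). The `-lgc` flag has existed since roughly the R415 driver branch, while `-lmc` only appeared in much newer branches, so whether it works on a given setup is an assumption here; the clock values below are examples, not recommendations:

```shell
# Pin the core clock to a min,max range in MHz (undo with: nvidia-smi -rgc)
nvidia-smi -lgc 1800,1950

# Pin the memory clock range in MHz, if the driver supports it (undo: nvidia-smi -rmc)
nvidia-smi -lmc 9501,9501
```

Locking the memory clock this way should stop the P-state bouncing described above without needing a frame-rate-heavy load.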


----------



## wuttman

lordzed83 said:


> @wuttman Well, do you think the same thing could be done using carbon conductive thermal pads?


If you stack them together? Dunno, maybe. Aren't they too expensive though?


----------



## lordzed83

wuttman said:


> If you stack them together? Dunno, maybe. Aren't they too expensive though?


In the day when low-end cards start at 480 bucks, nothing is too expensive.


----------



## Imprezzion

Ok, what? I thought CP2077 was a tough game power-limit and load wise.. I was wrong.. I just started playing Control for the first time... That game. My god. The poor 2x8-pin 3080 with a 370w power limit cannot even hold 1950MHz in that game. It drops all the way to 0.925-0.931v and a 1920-1935MHz setpoint, with effective clocks barely at or above 1900MHz, and it's dropping the voltage so low it will actually crash the driver after 10-15 min of playing at that offset, so I doubt it will even hold 1900MHz at all.

What is the point of a 320w model then? I mean, if I can't even keep it above 1900MHz with 370W at 1080P max settings with RTX High/Ultra in CP2077 and Control, then how does a 320w 3080 even hold its base clocks at 4K in these games? It would throttle into oblivion and struggle to even do 1700MHz, I guess?


----------



## lordzed83

Imprezzion said:


> Ok, what? I thought CP2077 was a tough game power-limit and load wise.. I was wrong.. I just started playing Control for the first time... That game. My god. The poor 2x8-pin 3080 with a 370w power limit cannot even hold 1950MHz in that game. It drops all the way to 0.925-0.931v and a 1920-1935MHz setpoint, with effective clocks barely at or above 1900MHz, and it's dropping the voltage so low it will actually crash the driver after 10-15 min of playing at that offset, so I doubt it will even hold 1900MHz at all.
> 
> What is the point of a 320w model then? I mean, if I can't even keep it above 1900MHz with 370W at 1080P max settings with RTX High/Ultra in CP2077 and Control, then how does a 320w 3080 even hold its base clocks at 4K in these games? It would throttle into oblivion and struggle to even do 1700MHz, I guess?


And ?? You could have looked up reviews on avg clocks....


----------



## Imprezzion

lordzed83 said:


> And ?? You could have looked up reviews on avg clocks....


I think you're getting it wrong. I'm perfectly happy with my card's performance, a bit shocked it failed +120 in Control, but still happy. But my point is, there are models with 320w power limits out there. My card is 370w and I already don't run the RGB or even the fans off the card, to keep as much power as possible for the card itself. I just don't understand how nVidia or board partners can justify making a 320w model of this card when clearly even a 370w card cannot get anywhere near its rated voltage at any clock speed in certain situations. The cards are rated for 1.100v max with a presumably nominal voltage of ~1.062v. No current model without a 3x8-pin can handle this. At all. Throttling to below 1.000v is normal. That seems like a very weird design choice, and it makes adjusting clocks very difficult because there's never one single testing point, both for us as overclockers and for board partners' factory OCs.

It really makes me wonder what this card could've been clock-speed wise if it could just run the full 1.100v all the time without being so extremely power limited. I think even a 3x8-pin 450w card cannot actually do this, can it? Probably like a stock 2010MHz boost and 2200+ actual?

This makes me wanna shunt mod it so bad.. I have the resistors just laying here on my desk and my soldering station is looking at me funny, but knowing I cannot possibly replace the card if something goes wrong means I won't do it yet.. As soon as the 3080 or 3080 Ti or whatever is actually widely available again for a normal price I will 100% shunt it and just go to the moon with this thing. Bykski full cover block, Arctic Accelero IV backplate, power limit to the moon, and a full 1.100v curve.
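Since the shunt mod keeps coming up: the trick is that the board measures current as a voltage drop across tiny sense resistors, and soldering another resistor in parallel lowers the resistance the controller sees, so it under-reads current and power by a fixed ratio. A rough sketch of the arithmetic; the 5 mOhm values are typical illustrative numbers, not the actual shunts on any specific card:

```python
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

def reported_power_scale(shunt_orig, shunt_added):
    """Fraction of the real power the controller reports after stacking
    shunt_added in parallel with the original sense resistor."""
    return parallel(shunt_orig, shunt_added) / shunt_orig

# Stacking an identical 5 mOhm resistor on a 5 mOhm shunt halves the reading
scale = reported_power_scale(0.005, 0.005)
print(scale)  # 0.5 -> a 370 W limit effectively becomes ~740 W of real draw
```

That "real limit = rated limit / scale" relationship is also why shunt mods are risky: the VRM and cables see the full doubled power even though every software sensor still reports the stock numbers.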


----------



## lordzed83

Imprezzion said:


> I think you're getting it wrong. I'm perfectly happy with my card's performance, a bit shocked it failed +120 in Control, but still happy. But my point is, there are models with 320w power limits out there. My card is 370w and I already don't run the RGB or even the fans off the card, to keep as much power as possible for the card itself. I just don't understand how nVidia or board partners can justify making a 320w model of this card when clearly even a 370w card cannot get anywhere near its rated voltage at any clock speed in certain situations. The cards are rated for 1.100v max with a presumably nominal voltage of ~1.062v. No current model without a 3x8-pin can handle this. At all. Throttling to below 1.000v is normal. That seems like a very weird design choice, and it makes adjusting clocks very difficult because there's never one single testing point, both for us as overclockers and for board partners' factory OCs.
> 
> It really makes me wonder what this card could've been clock-speed wise if it could just run the full 1.100v all the time without being so extremely power limited. I think even a 3x8-pin 450w card cannot actually do this, can it? Probably like a stock 2010MHz boost and 2200+ actual?
> 
> This makes me wanna shunt mod it so bad.. I have the resistors just laying here on my desk and my soldering station is looking at me funny, but knowing I cannot possibly replace the card if something goes wrong means I won't do it yet.. As soon as the 3080 or 3080 Ti or whatever is actually widely available again for a normal price I will 100% shunt it and just go to the moon with this thing. Bykski full cover block, Arctic Accelero IV backplate, power limit to the moon, and a full 1.100v curve.


Man, you're not looking hard enough. Scaling is absolute GARBAGE.





There are 800w benchmarks out there. My card can do 2220 stable, and that's a GOOD bin/silicon.


----------



## acoustic

I don't know if I'd bother shunt modding.

Another game to try that will obliterate your power limit... Metro Exodus. I'm sure it'll be even worse when the big DLSS 2.0 update comes out with the added RTX reflections. That game keeps my 3080 FTW3 pegged at 430w+ the entire time; it never drops under 400w at 3840x1600 lol


----------



## MrKenzie

acoustic said:


> I don't know if I'd bother shunt modding.
> 
> Another game to try that will obliterate your power limit... Metro Exodus. I'm sure it'll be even worse when the big DLSS 2.0 update comes out with the added RTX reflections. That game keeps my 3080 FTW3 pegged at 430w+ the entire time; it never drops under 400w at 3840x1600 lol


Looking back at a post I made here about Metro Exodus, it looks like my card at 450W could only sustain 2000MHz effective clocks, 170MHz lower than most other games. It certainly is a power hungry game!


----------



## Imprezzion

Yup, I just tried it. It's about as bad as Control on my card. Division 2, for example, hangs around 1960-1980 effective, Control 1900-1920, and Exodus 1890-1900, all flatlined at a constant 355-360w. 59c core, 70c hotspot, 88c junction. (Stock air cooler + Liquid Ultra, no shroud / stock fans + 2x 140mm fans.)


----------



## Clukos

Finally decided to put liquid metal!

20K GPU timespy: I scored 18 664 in Time Spy
12870 PR: I scored 12 870 in Port Royal

I wonder how much more I can push out of this thing with the 370W BIOS.


----------



## Peter Watson

Clukos said:


> Finally decided to put liquid metal!
> 
> 20K GPU timespy: I scored 18 664 in Time Spy
> 12870 PR: I scored 12 870 in Port Royal
> 
> I wonder how much more I can push out of this thing with the 370W BIOS.


I can't get anywhere near 20k; my EVGA FTW3 Ultra with the 450W BIOS is trash. I'm only keeping it because I can't get another one. My best score: I scored 18 821 in Time Spy


----------



## Imprezzion

3080 Suprim X in stock locally.. €2099...






Megekko.nl - MSI GeForce RTX 3080 SUPRIM X 10G Videokaart







I swear these shops are just as guilty of scalping customers as the average eBay scalper.. horrible..


----------



## BluePaint

Problem


Peter Watson said:


> I can't get anywhere near 20k; my EVGA FTW3 Ultra with the 450W BIOS is trash. I'm only keeping it because I can't get another one. My best score: I scored 18 821 in Time Spy


There must be something wrong in the OS or the Nvidia configuration. The measured 2240MHz average @ 20C and decent VRAM speed, paired with a 10850k > 5GHz (the CPU score seems right) and 4000MHz RAM, should give you something like a 20500-21000 GPU score. Something doesn't add up.
Did you check that your VRAM is not in error-correction mode (faster frequency but lower fps/score)? Is your PCIe in x16 mode? What's your PR score?


----------



## Imprezzion

Those scores are the reason I want to shunt my card lol. Or just get a 3x8 pin at a "normal" price. My 2x8 pin 370w card only gets like 18600 at best..


----------



## acoustic

Imprezzion said:


> Those scores are the reason I want to shunt my card lol. Or just get a 3x8 pin at a "normal" price. My 2x8 pin 370w card only gets like 18600 at best..


Amounts to nothing in real-world use though.


----------



## BluePaint

acoustic said:


> Amounts to nothing in real-world use though.


Maybe not much in fps, but there comes a great feeling of freedom when there is no PL, as with a 1080 Ti and the XOC BIOS. Just open the AB curve editor, choose any point on the voltage curve, and it will stay there, transfixed.


----------



## Peter Watson

BluePaint said:


> Problem
> 
> There must be something wrong in the OS or the Nvidia configuration. The measured 2240MHz average @ 20C and decent VRAM speed, paired with a 10850k > 5GHz (the CPU score seems right) and 4000MHz RAM, should give you something like a 20500-21000 GPU score. Something doesn't add up.
> Did you check that your VRAM is not in error-correction mode (faster frequency but lower fps/score)? Is your PCIe in x16 mode? What's your PR score?


My PR score is terrible. The card is solid stable, just low scores; I'm about 9 fps down. It was fine when I first got it, it could do a 12850 PR score on air; now, 4 months later, I hit around 11900, and all temps are super cool. I'm thinking of taking the water block off to see if it's that. I was hoping it was my motherboard, so I changed from an i9 9900k to an i9 10850k, and there is no difference; both chips clock the same at 5.5GHz. The only thing I've not changed is my PSU, which is quite old.


----------



## MrKenzie

Peter Watson said:


> My PR score is terrible. The card is solid stable, just low scores; I'm about 9 fps down. It was fine when I first got it, it could do a 12850 PR score on air; now, 4 months later, I hit around 11900, and all temps are super cool. I'm thinking of taking the water block off to see if it's that. I was hoping it was my motherboard, so I changed from an i9 9900k to an i9 10850k, and there is no difference; both chips clock the same at 5.5GHz. The only thing I've not changed is my PSU, which is quite old.


Are you sure G-Sync or something isn't turned on? In the past I've had a card score almost identically after 4 years of heavy overclocking with shunt mods; they don't just degrade like some people think. There must be a setting that is causing the low scores.


----------



## BluePaint

Peter Watson said:


> I'm thinking of taking the water block off to see if it's that


Do you have sensors for VRM temps? In HWiNFO you can check GPU hotspot and VRAM junction temps. With my aqucool block I had the impression that some pads weren't the right size, so I changed them.


----------



## Peter Watson

BluePaint said:


> Do you have sensors for VRM temps? In HWiNFO you can check GPU hotspot and VRAM junction temps. With my aqucool block I had the impression that some pads weren't the right size, so I changed them.


All temps are low; it's just the card. I've tried everything else other than a new PSU, but if it was the PSU I'm sure I would be having crashing issues etc.


----------



## Clukos

That GPU score looks really low for 2.2GHz + core

Maybe your memory is throttling.


----------



## Imprezzion

So, just out of curiosity, is 18600 a low TS Graphics score when I'm running +1250 memory (not error-correcting) at an average of ~2025MHz core (effective clock 1985-2010)?


----------



## Nizzen

Imprezzion said:


> So, just out of curiosity, is 18600 a low TS Graphics score when I'm running +1250 memory (not error-correcting) at an average of ~2025MHz core (effective clock 1985-2010)?


I got 2044MHz average on a 3090 Strix on air. Result: Graphics Score 21 541.

CPU was an 11700k.


----------



## edhutner

@Nizzen 
I think the topic is about the 3080, not the 3090.


----------



## Peter Watson

Clukos said:


> That GPU score looks really low for 2.2GHz + core
> 
> Maybe your memory is throttling.


It's just the card, I have no clue why. I should be hitting over 13k in PR but it will only scrape the very low 12k's; at stock the card gets 10956 in Port Royal. It's not the end of the world, just disappointing.

When I benchmark, all temps on everything in my PC are below 50c, so there are definitely no temp issues.

I will have a look at the stock cooler to see if it cooled anything my water block doesn't. But other than that I'm stumped.


----------



## Imprezzion

Peter Watson said:


> It's just the card, I have no clue why. I should be hitting over 13k in PR but it will only scrape the very low 12k's; at stock the card gets 10956 in Port Royal. It's not the end of the world, just disappointing.
> 
> When I benchmark, all temps on everything in my PC are below 50c, so there are definitely no temp issues.
> 
> I will have a look at the stock cooler to see if it cooled anything my water block doesn't. But other than that I'm stumped.


Do you see a very large difference between MSI AB's reported clock and HWiNFO64's effective clock? If so, that might be a good indicator of internal throttling. A bit of difference is normal, but not more than ~30-40MHz.
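The reported-vs-effective comparison boils down to a one-line check; the ~40MHz threshold is just the rule of thumb from this thread, not an official figure:

```python
def throttling_suspect(reported_mhz, effective_mhz, threshold=40):
    """Flag likely internal (clock-stretching) throttling when the
    effective clock lags the reported clock by more than the threshold."""
    return (reported_mhz - effective_mhz) > threshold

print(throttling_suspect(2100, 2085))  # False: a ~15 MHz gap is normal
print(throttling_suspect(2100, 2010))  # True: a 90 MHz gap suggests throttling
```

Logging both values over a benchmark run and checking the gap per sample is more reliable than eyeballing the two numbers in real time.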


----------



## BluePaint

Peter Watson said:


> It's just the card, I have no clue why


Check your GPU utilization in the 3DMark graphs; it should be close to 100%.
That's one possible reason for high clocks but low scores.


----------



## BluePaint

Clukos said:


> Finally decided to put liquid metal!
> 
> 20K GPU timespy: I scored 18 664 in Time Spy
> 12870 PR: I scored 12 870 in Port Royal
> 
> I wonder how much more I can push out of this thing with the 370W BIOS.


Those are some great scores with 370W!


----------



## Peter Watson

BluePaint said:


> Check your GPU utilization in the 3DMark graphs; it should be close to 100%.
> That's one possible reason for high clocks but low scores.







I did a quick recording with HWiNFO64 running, but everything looks OK; I don't know if anyone else can see any issues.


----------



## Imprezzion

Peter Watson said:


> I did a quick recording with HWiNFO64 running, but everything looks OK; I don't know if anyone else can see any issues.


Effective clocks are sitting around 2115-2130MHz, so that should be fine. I am quite surprised by the extremely low power draw though. It's barely touching 370w under load, which is quite low for the load it's under at that voltage and those clocks.


----------



## Peter Watson

Imprezzion said:


> Effective clocks are sitting around 2115-2130MHz, so that should be fine. I am quite surprised by the extremely low power draw though. It's barely touching 370w under load, which is quite low for the load it's under at that voltage and those clocks.


Low wattage will probably be because of temps, and Time Spy was in windowed mode; it won't be pulling as much current.


----------



## Clukos

BluePaint said:


> Those are some great scores with 370W!


I think I'm getting away with it due to the very good memory chips I got on this 3080 

The clock range is nothing special but not bad either, it usually hovers around 2.1GHz in benchmarks.

I wish there was a way to unlock the power without shunt mod, I sold my 1080 Ti and if I manage to kill the 3080 it'll be a long time before I can find another one at MSRP :/


----------



## BluePaint

Clukos said:


> I wish there was a way to unlock the power without shunt mod


Indeed! I skipped last generation. Was there such a thing for Volta?


----------



## Imprezzion

Yup, the 2080 Ti had several unlocked BIOSes. I ran the Kingpin XOC 2000w one on a reference PCB A chip. I saw 524w in Superposition 4K Optimized, and game power draw was about 440-470w @ 2160MHz 1.125v.


----------



## BluePaint

@Clukos
What did you modify on your FE?
Managed to squeeze 12635 points in PR out of an unmodified FE:
100% fans + window mod = 40C, +1300 VRAM, +250 core = 2160 max / 2110 avg


----------



## edhutner

What should be the expected max stable core OC for a Suprim X 3080 on water? And what is the best method for stability testing?

Currently in MSI AB I have voltage +100, power 116%, core +110MHz. It is stable in benches and gaming. Additionally I have run a couple of hours of MSI Kombustor with the artifact scanner.

Assetto Corsa Competizione is the game I play mostly, and there I get about 2055-2070MHz, voltage-limited at max 1.087v.

I have tried the AB OC scanner. It made a curve averaging +70 or +90MHz (don't remember). That average is not very encouraging, and it makes me think my manual +110MHz may surprise me with a crash at some point.

I want to go higher, but I want to be sure it will be 100% stable. I don't want to crash during an online race in ACC. This is why I need a more empirical method to validate stability.


----------



## BluePaint

If stability is of utmost importance, just be conservative: back off your OC by 40 or 50MHz and be safe. I don't think you will notice 1-2% less performance. Trying to squeeze out the last 1 or 2% and going for absolute stability at the same time don't go together.

I would also cap the framerate, which, in addition to taxing the GPU less on average, usually gives smoother gameplay if chosen correctly (somewhat lower than the peak framerate most of the time).


----------



## Imprezzion

Probably around 2130-ish. It's a lottery. Only way to find out is to test it, lol. I fixed my crashes in Control somehow. It's related to temperatures, apparently. At like 65% fan speed with the core around 62C, hotspot 74C, junction 86C, it crashes at +105, but when I crank the fans to 100% with temps of core 55C, hotspot 63C, junction 74C, it runs fine at +105 for hours. So these cards do seem to be just as temperature-sensitive as RTX 2xxx was in some scenarios.

Still, even at the max power limit of 370W and with no fan or RGB power drawn from the card, it still limits hard in Control with no DLSS, 1080p, all max quality, 4x MSAA, RT maxed as well. At +105 it barely holds above 1900MHz, with effective clocks sometimes even dipping under that to ~1890MHz, and voltage sits around 0.943-0.962V most of the time. Performance is outstanding, on the other hand: it never drops below 80 FPS, averages around 95 FPS in that game, and it's unbelievably smooth with Ultra Low Latency mode enabled.

If I run a lighter game like BF4 at a 288 FPS cap (2x144), or World of Tanks with 144Hz G-Sync enabled, it runs around 2070-2085MHz all the time at 1.087-1.100V and never hits power limits either. Effective clocks 2060-2080MHz all the time.


----------



## BluePaint

Using the last cold days with 0C air:
2nd place FS Ultra GPU leaderboard: 12730
2nd place FS Extreme GPU leaderboard: 25029


----------



## Clukos

BluePaint said:


> @Clukos
> What did you modify on your FE?
> Managed to squeeze out 12635 points in PR from an unmodified FE.
> 100% fans + Window Mod = 40C @ +1300 VRAM @ +250 core = 2160 max, 2110 avg


I have the EK waterblock on it with liquid metal, which allowed me to run slightly higher sustained clocks. I also got +1900 on the mem (that's the point where it stops giving me extra performance). Otherwise, I think the main limiting factor is the power limit atm.


----------



## arrow0309

Hiya, just bought a Bykski block for my 3080 Trio from AliExpress, this one exactly (without the backplate anyway):

Bykski Full Coverage GPU Water Block and Backplate for MSI RTX 3080/3090 GAMING X TRIO (N-MS3090TRIO-X)
This full coverage block directly cools the video card's core (GPU), graphics memory (RAM), and voltage regulator module (VRM). By directly cooling these components you can achieve great temperatures...
www.bykski.us

And I was wondering if someone can post/link a manual; I just want to see the thermal pad dimensions, especially those 2 slim strips on the MOSFETs.
Or, if anyone has used it, I'd like to know if they were 0.5mm or 1mm thick,
so I can order better ones.
On the memory ICs and/or some other choke/POSCAP zones they should probably be around 1mm; I can use the bundled ones or find something at home.
But I'd like to provide better ones for the MOSFETs; I don't know what they bundle with this cheap block (paid £85, taxes included).
Thanks


----------



## jura11

@arrow0309 

Bykski uses 1.2-1.25mm thermal pads; I'm not sure of their W/mK rating, but it looks like they're 4 W/mK.

Running Bykski waterblocks on my two RTX 3090 GamingPros with no issues; temperatures are great, 36-38°C in gaming, and in rendering they're in the low 30s (32-35°C). I'm running the KPE XOC BIOS on both RTX 3090s with the power limit set to 65%, and VRAM temperatures won't break 60°C on either GPU, with a 1495MHz VRAM OC on one and 1200MHz on the other.

I built a loop for a friend with a 10900K and an RTX 3090; we used a 360mm radiator on top and a 240mm radiator on the bottom, and in his loop we have seen 42-45°C as the highest temperature on the RTX 3090 with the KFA2 390W BIOS.

Whether you'd gain anything running better thermal pads like the Thermalright Odyssey, I have no idea, because I haven't tried better pads on my GPUs yet.

Hope this helps 

Thanks, Jura


----------



## arrow0309

jura11 said:


> @arrow0309
> 
> Bykski uses 1.2-1.25mm thermal pads; I'm not sure of their W/mK rating, but it looks like they're 4 W/mK.
> 
> Running Bykski waterblocks on my two RTX 3090 GamingPros with no issues; temperatures are great, 36-38°C in gaming, and in rendering they're in the low 30s (32-35°C). I'm running the KPE XOC BIOS on both RTX 3090s with the power limit set to 65%, and VRAM temperatures won't break 60°C on either GPU, with a 1495MHz VRAM OC on one and 1200MHz on the other.
> 
> I built a loop for a friend with a 10900K and an RTX 3090; we used a 360mm radiator on top and a 240mm radiator on the bottom, and in his loop we have seen 42-45°C as the highest temperature on the RTX 3090 with the KFA2 390W BIOS.
> 
> Whether you'd gain anything running better thermal pads like the Thermalright Odyssey, I have no idea, because I haven't tried better pads on my GPUs yet.
> 
> Hope this helps
> 
> Thanks, Jura


Hello mate, thank you for the comprehensive reply.
It's OK then; I'll stick with the bundled ones in that case, no worries.
One last question: did you apply them only to the VRAM and the 2 main MOSFET strips (like they indicate), or did you put 2 more on the chokes and the POSCAPs (left side only)?


----------



## mouacyk

@jura11 and @arrow0309 Doesn't the pad thickness depend on who made the card? Palit may not have the same height offsets as MSI when it comes to VRAM and VRMs. For example, my card is a Gigabyte Eagle OC and all its pads are 1mm, the cheapo 2-3 W/mK type.


----------



## jura11

arrow0309 said:


> Hello mate, thank you for the comprehensive reply.
> It's OK then; I'll stick with the bundled ones in that case, no worries.
> One last question: did you apply them only to the VRAM and the 2 main MOSFET strips (like they indicate), or did you put 2 more on the chokes and the POSCAPs (left side only)?


Hi there 

No worries; yes, I applied thermal pads only in the indicated places (the 2 main MOSFET strips and the VRAM, and I placed them on the POSCAPs too). No, I didn't put them on the chokes, and I'm not sure that would make any difference in temperatures.

In something like mining, VRAM temperatures with a 1200MHz VRAM OC would of course be higher; in my case I have seen around 72°C max on the top GPU and 82°C on the bottom (that's VRAM temperatures), and core temperatures have been 32-36°C with +105MHz.

Hope this helps 

Thanks, Jura


----------



## arrow0309

mouacyk said:


> @jura11 and @arrow0309 Doesn't the pad thickness depend on who made the card? Palit may not have the same height offsets as MSI when it comes to VRAM and VRMs. For example, my card is a Gigabyte Eagle OC and all its pads are 1mm, the cheapo 2-3 W/mK type.


Not exactly, since the waterblock is also a custom product and can be made with identical or different offsets.



jura11 said:


> Hi there
> 
> No worries; yes, I applied thermal pads only in the indicated places (the 2 main MOSFET strips and the VRAM, and I placed them on the POSCAPs too). No, I didn't put them on the chokes, and I'm not sure that would make any difference in temperatures.
> 
> In something like mining, VRAM temperatures with a 1200MHz VRAM OC would of course be higher; in my case I have seen around 72°C max on the top GPU and 82°C on the bottom (that's VRAM temperatures), and core temperatures have been 32-36°C with +105MHz.
> 
> Hope this helps
> 
> Thanks, Jura


Jeez, kinda high temps (for water-cooled hardware), but your VRAM is also clocked pretty high.
Mine are like 10k daily and +105-120 on the GPU; also, a 3080 with fewer memory ICs should ensure lower temps all around.
So yeah, maybe I'll stick with the bundled pads as well.
Cheers mate


----------



## jura11

arrow0309 said:


> Not exactly, since the waterblock is also a custom product and can be made with identical or different offsets.
> 
> 
> Jeez, kinda high temps (for water-cooled hardware), but your VRAM is also clocked pretty high.
> Mine are like 10k daily and +105-120 on the GPU; also, a 3080 with fewer memory ICs should ensure lower temps all around.
> So yeah, maybe I'll stick with the bundled pads as well.
> Cheers mate


Hi there 

In normal gaming I see 36-38°C, and in rendering I see 32-33°C max; VRAM temperatures in gaming won't break 60°C, and in rendering they're in the 60s all the time.

I'm running 1495MHz on the VRAM on the first one, and on the second I use 1200MHz, e.g. for rendering or benchmarks.

Only in mining do you see higher VRAM temperatures; normally the hotspot is in the low 40s and core temperatures are in the 30s with the KPE XOC BIOS capped at 65% (capped at 65%, each GPU pulls around 446W).

Hope this helps 

Thanks, Jura


----------



## Hiikeri

I got my MSI Suprim X 3080 yesterday. With the *stock air cooler* + stock BIOS:

- 3DMark Time Spy graphics score > 20.2k
- 3DMark Port Royal > 13k









I scored 18 042 in Time Spy
Intel Core i9-9900K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10
www.3dmark.com

I scored 13 033 in Port Royal
Intel Core i9-9900K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10
www.3dmark.com


----------



## BluePaint

@Hiikeri
Looks like a good chip!

The average clock speed in benchmarks and the temps look very similar to my Trio with the Strix BIOS and air cooler.
Your [email protected] GHz and >4000MHz RAM is worth a lot of points.


----------



## edhutner

Wow, great scores. I can't match those with my 3080 Suprim X, and it's watercooled. My latest results: Time Spy graphics score 19355, Port Royal 12544.

I also note that my GPU temperature reading seems wrong. Especially when idling, it sometimes shows 1-2 degrees lower than my coolant temperature, and that is impossible.


----------



## Hiikeri

We are now investigating here in Finland a strange BIOS issue I found about 2 hours ago.

Those were my original Suprim X results; in my Time Spy run the average clock was 2110MHz,

and in my Port Royal bench the average clock was 2123MHz.

But here's what's strange now: I flashed my MSI card to the Asus Strix OC 450W BIOS...

and I can increase my Time Spy and Port Royal average clocks (average! MHz) by about +50MHz, but my 3DMark scores are lower!

My Port Royal average clock on the Strix OC 450W BIOS is now as high as 2170MHz, but my PR scores are still lower than on my Suprim X BIOS at an average of 2123MHz.

My VRAM speed was throttling in my Suprim X runs, but not in my Asus BIOS results (max mem clock +1375, avg. also +1375).

So we are investigating this issue here in Finland: I get +50MHz more on the GPU, but my scores are still lower than on my original MSI Suprim X (430W) BIOS...

At first sight, many of our 3080 owners think it may be some kind of Asus "cheat MHz" issue: higher reported clocks without more power!

That could explain why most YouTube reviewers can drive Strix cards to such high clocks. Usually Suprim X, FTW3, GB Xtreme, Galax and other top-tier cards only reach 2075-2100MHz, while Strix OC cards in the same situation always reach ~2135-2150MHz! At least the reviewers run Tomb Raider in their videos... you know. On Asus cards the clocks are about +50MHz higher than on other brands' 3080s.

I scored 13 033 in Port Royal < avg. 2123MHz, Suprim X 430W BIOS, 13033 score (VRAM +1366MHz, slight throttling)
I scored 13 024 in Port Royal < avg. 2170MHz, Asus 450W BIOS, 13024 score (VRAM +1375MHz, no throttling)


----------



## Imprezzion

Effective Clock in HWiNFO64 will tell the whole story.
These cards don't actually run at the clock speed MSI AB or 3DMark reports.

Check what the effective clock is with the stock BIOS and with the Asus BIOS as well. It's probably much lower on the Asus.
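For anyone without HWiNFO64 (e.g. on Linux), a rough way to watch for the same throttling trend is to poll nvidia-smi. Note this reports the driver's requested graphics clock, not HWiNFO's "Effective Clock", but sag under load still shows up. A minimal sketch, assuming `nvidia-smi` is on your PATH:

```python
import subprocess

# nvidia-smi query for the current graphics clock and board power draw
QUERY = ["nvidia-smi",
         "--query-gpu=clocks.gr,power.draw",
         "--format=csv,noheader,nounits"]

def parse_stats(csv_line):
    """Parse one 'clocks.gr, power.draw' CSV line into (MHz, watts)."""
    core, power = (field.strip() for field in csv_line.split(","))
    return int(core), float(power)

def sample():
    """Take one live sample (requires an NVIDIA GPU and driver)."""
    out = subprocess.check_output(QUERY, text=True)
    return parse_stats(out.splitlines()[0])
```

Call `sample()` in a loop (say, once a second while a benchmark runs) and log the results; a clock that sags while power sits pinned at the limit is exactly the power throttling being discussed here.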


----------



## Hiikeri

So the Asus BIOS's reported clocks are far away from the real (effective) clocks?


----------



## BluePaint

I have been using the Strix BIOS on the Trio for a couple of months, since the Suprim BIOS wasn't available at the time. I also tried the FTW BIOS back then, but the Strix seemed to perform a little better.
Haven't tried the Suprim BIOS yet, since it has a 20W lower power limit. Not sure I'll have time in the next few days, but I might have a look as well.


----------



## marashz

Just installed a Heatkiller V on my XC3 3080. What's the best BIOS for a non-shunt-mod run?
2100MHz @ 1012mV is perfectly fine in *Unigine Heaven*, but when I run Port Royal the clock drops to 1920-1935MHz, with a few spikes to 1875MHz and 1950MHz. Power limit.
Even after I push the power to 107%, it feels like the GPU throttles at 320W, with some spikes to 330-340W.
Is the Asus TUF still the best for 2x 8-pin cards, or is there something new?
==== EDIT
Best Port Royal score: 12,138 at 943mV (943-950mV gives good uptime at 2025MHz in PR; at 962mV the clock is lower and the score is lower. Power limit..).


----------



## Imprezzion

I run the XC3 BIOS on my 2x 8-pin for that reason. The effective clocks on that 2x 8-pin BIOS are higher than the TUF and the stock Gigabyte Gaming OC BIOS for me, even though it has 366W instead of 370W.


----------



## mouacyk

Imprezzion said:


> I run the XC3 BIOS on my 2x 8-pin for that reason. The effective clocks on that 2x 8-pin BIOS are higher than the TUF and the stock Gigabyte Gaming OC BIOS for me, even though it has 366W instead of 370W.


Were you randomly trying compatible BIOSes and discovered this? Interesting, I may have to try it out too, at the cost of losing the 1x HDMI that I'm not using.


----------



## Imprezzion

mouacyk said:


> Were you randomly trying compatible BIOSes and discovered this? Interesting, I may have to try it out too, at the cost of losing the 1x HDMI that I'm not using.


Quite literally that. I tried every single 2x 8-pin BIOS on this Gigabyte Gaming OC, and the XC3 performs by far the best. The stock BIOS is second, and the TUF doesn't work well at all on this card.

Results:

Core set to 2010MHz @ 0.987V with the respective offset for the BIOS it runs with, +1250 memory.

Division 2, all max, 1080p, Summit run, 3 floors.
This game simulates hitting constant power limits. It requests way, way more than 370W.

Stock Gaming OC average effective clock: ~1940MHz. Reported power draw: 330-335W.
TUF average effective clock: 1920MHz. Reported power draw: 335-340W.
XC3 average effective clock: 1965MHz. Reported power draw: 350-355W.

All 3 are well below the setpoint because of power limits, but the XC3 performs noticeably better. It also seems to allow slightly higher power draw.

Setpoint 2070MHz @ 1.100V, 2055MHz @ 1.081V.

Cyberpunk 2077, all max, 1080p, RT Psycho, DLSS Balanced. With DLSS these settings mean the load is much lower, so it doesn't hit power limits at all.

Stock Gaming OC: average effective clock 2040MHz, reported power 310W.
TUF: average 2030MHz, 320W.
XC3: 2060MHz, 320W.

Then, as a last test, with just an offset of +105, no curve, so the card can determine its own clocks.

3DMark Time Spy, average of 3 runs (GPU score only):
Stock: 18455, average reported clocks ~2025MHz, average effective clocks 1990MHz.
TUF: 18388, average reported clocks ~2010MHz, average effective clocks 1980MHz.
XC3: 18690, average reported clocks 2040MHz, average effective clocks 2015MHz.

So yeah, on my specific model's custom PCB (GB Gaming OC / Eagle / Vision) the XC3 BIOS for some reason runs way better. This might also explain why the XC3 (Ultra) performs so far above other 2x 8-pin cards in reviews despite having a 4W lower power limit.

This might not apply to other custom-PCB 2x 8-pin models!!
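To put those Time Spy numbers in perspective, the BIOS-to-BIOS spread works out to barely over a percent (GPU scores taken from the runs above):

```python
def pct_delta(score, baseline):
    """Percent difference of score versus baseline."""
    return (score - baseline) / baseline * 100

# GPU-score averages for the three BIOSes tested above
stock, tuf, xc3 = 18455, 18388, 18690

print(f"TUF vs stock: {pct_delta(tuf, stock):+.2f}%")   # -0.36%
print(f"XC3 vs stock: {pct_delta(xc3, stock):+.2f}%")   # +1.27%
```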


----------



## marashz

But I don't get why my XC3 Ultra can't reach 360W. The max HWiNFO ever caught was 345W, but when I play or test and look at the OSD, I see 320W all the time.


----------



## mouacyk

marashz said:


> But I don't get why my XC3 Ultra can't reach 360W. The max HWiNFO ever caught was 345W, but when I play or test and look at the OSD, I see 320W all the time.


Just to be clear @Imprezzion, there are *three* XC3 BIOSes on TPU. You must be talking about the plain "XC3" one, not the Black or the Ultra, correct?


----------



## Glottis

Imprezzion said:


> Stock: 18455, average reported clocks ~2025Mhz, average effective clocks 1990Mhz.
> TUF: 18388, average reported clocks ~2010Mhz, average effective clocks 1980Mhz.
> XC3: 18690, average reported clocks 2040Mhz, average effective clocks 2015Mhz.
> 
> So yeah, on my specific model's custom PCB (GB Gaming OC / Eagle / Vision) the XC3 BIOS for *some reason runs way better. *


We must have different understandings of what "way better" means. A 1.27% difference between the stock BIOS and the XC3 BIOS might as well be margin of error.


----------



## marashz

Btw, first time water-cooling a GPU. Do thermal pads need some time to start doing their job? I may have done something a bit wrong: the GPU chip temp max was 43C (but w/e, I saw others get under 40C, and I'm fine with 43C at 320W), but my memory hit 74C in gaming and 100C in mining (240W, aka 70% power limit, and +800 mem).
There were 0.5mm and 1mm thermal pads. I did everything as in the manual: 0.5mm on the memory, 1mm on the other parts.
Also, in two weeks I will get Grizzly Minus 1mm pads, so I can replace what Watercool gave me + change the thermal paste on the chip...


----------



## Imprezzion

mouacyk said:


> Just to be clear @Imprezzion, there are *three* XC3 BIOSes on TPU. You must be talking about the plain "XC3" one, not the Black or the Ultra, correct?


The Ultra, but AFAIK there's no real difference between them other than the model number. All parameters are the same.

And yeah, I get what you're saying: it isn't "way better", but it does show a measurable improvement that holds up in game tests, even though it's small.


----------



## Jimmy2Shoes

Imprezzion said:


> The Ultra, but AFAIK there's no real difference between them other than the model number. All parameters are the same.
> 
> And yeah, I get what you're saying: it isn't "way better", but it does show a measurable improvement that holds up in game tests, even though it's small.


Hey buddy,

So I have a Gigabyte RTX 3080 Gaming OC, and I see in your sig that you have it at +105/+1200. Is +105 stable for you in gaming and in PR?
In the past I used the curve editor to undervolt and overclock my 2080 Ti to 2100MHz.
I applied the same principle to the 3080, but after learning about effective clock speed I think I was running a placebo of an overclock on the 2080 Ti, although it did satisfy me to see a solid 2100. It's sold now, so I can't go back and test.
So now I am running 60/1000 with no curve editor.
I am wondering, given that you have rerouted the fans' power + are using the XC3 BIOS, how much of a difference in % you think you have gained over stock settings.


----------



## Imprezzion

Jimmy2Shoes said:


> Hey buddy,
> 
> So I have a Gigabyte RTX 3080 Gaming OC, and I see in your sig that you have it at +105/+1200. Is +105 stable for you in gaming and in PR?
> In the past I used the curve editor to undervolt and overclock my 2080 Ti to 2100MHz.
> I applied the same principle to the 3080, but after learning about effective clock speed I think I was running a placebo of an overclock on the 2080 Ti, although it did satisfy me to see a solid 2100. It's sold now, so I can't go back and test.
> So now I am running 60/1000 with no curve editor.
> I am wondering, given that you have rerouted the fans' power + are using the XC3 BIOS, how much of a difference in % you think you have gained over stock settings.


Oof, that's hard to say. It's stable in everything, but the clock is never the same since it throttles on power in most tests. In PR it only boosts to about 1980 at +105, but in BF4, for example, it will do 2085MHz at +105 just fine, as it isn't power throttling.

The BIOS choice is more like: this works, is stable, and isn't worse than stock, so I'm just going with it. It has dual BIOS anyway.

Powering the fans from another source saves about 10-15W, I guess. I never really measured it with the 3080, but I did on the 2080 Ti I had, and it worked out to about that.

There's not much to gain with the power limit we have, so just use an offset that seems stable in whatever you play.


----------



## Hiikeri

marashz said:


> Btw, first time water-cooling a GPU. Do thermal pads need some time to start doing their job? I may have done something a bit wrong: the GPU chip temp max was 43C (but w/e, I saw others get under 40C, and I'm fine with 43C at 320W), but my memory hit 74C in gaming and 100C in mining (240W, aka 70% power limit, and +800 mem).
> There were 0.5mm and 1mm thermal pads. I did everything as in the manual: 0.5mm on the memory, 1mm on the other parts.
> Also, in two weeks I will get Grizzly Minus 1mm pads, so I can replace what Watercool gave me + change the thermal paste on the chip...


I think your pads are too thin.
For example, on the MSI Suprim, EVGA FTW3 and maybe Strix cards, the memory pads must be 2mm, and on the back side of the PCB they must be 3mm so that contact with the backplate happens.

The power delivery pads are thinner.


----------



## Tristanguy1224

Hey... New 3080 owner and I have a question. I've flashed modded vBIOSes onto every card I've had for years now (thanks again, Mr. Dark, for those 970 vBIOSes that got me into it). Anyway, I'm hitting the power limit HARD on my card. I even used Afterburner to set a voltage curve and undervolted the crap out of it, and even running at only like 1920-1965MHz at ~987mV it bangs off the power limit like crazy. I want to flash a higher-TDP vBIOS to it. I have the MSI 3080 Ventus OC (I didn't get a choice of card; I had to get the manager at the Micro Center I used to work at to pull it when it came off the truck). Anyway... what's the best one to flash to that card? It only has 2x 8-pin, but I used to regularly push 430+ watts through my 1080 Ti with an XOC vBIOS.


----------



## marashz

Hiikeri said:


> I think your pads are too thin.
> For example, on the MSI Suprim, EVGA FTW3 and maybe Strix cards, the memory pads must be 2mm, and on the back side of the PCB they must be 3mm so that contact with the backplate happens.
> 
> The power delivery pads are thinner.


I have checked the stock thermal pads: the memory ones are 2mm, the others more like 2.5mm. Next week I will get Thermal Grizzly 0.5mm, 1mm and 1.5mm pads, and I will recheck whether there are any pressure marks on the pads Watercool provided with the Heatkiller V...
I had better memory temps while mining with the stock ****ty XC3 air cooler than now with a MO-RA3 420.
===== EDIT
Oh, now that I think about it, the memory layout in the manual and on the actual card was a bit different; maybe that one chip has poor contact with the block...
===== EDIT 2
Funny thing: on my XC3 Ultra the memory has a different layout, 4 side + 2 top + 3 side + 1 bottom.


----------



## Imprezzion

Tristanguy1224 said:


> Hey... New 3080 owner and I have a question. I've flashed modded vBIOSes onto every card I've had for years now (thanks again, Mr. Dark, for those 970 vBIOSes that got me into it). Anyway, I'm hitting the power limit HARD on my card. I even used Afterburner to set a voltage curve and undervolted the crap out of it, and even running at only like 1920-1965MHz at ~987mV it bangs off the power limit like crazy. I want to flash a higher-TDP vBIOS to it. I have the MSI 3080 Ventus OC (I didn't get a choice of card; I had to get the manager at the Micro Center I used to work at to pull it when it came off the truck). Anyway... what's the best one to flash to that card? It only has 2x 8-pin, but I used to regularly push 430+ watts through my 1080 Ti with an XOC vBIOS.
> View attachment 2483283


EVGA XC3 Ultra or Gigabyte Gaming OC 370W BIOS. There's no way to get a 2x 8-pin above 370W with a BIOS. So far, for me, the XC3 Ultra BIOS performs the best and has the best I/O layout and fan percentages. Depending on the benchmark/game, even at +105 mine drops as low as 1920-1960 as well (CP2077 RT Psycho without DLSS, or Control with RT maxed).


----------



## noxyd

Hi guys,

I am running an FTW3 with an EK block.
I am also in a small-form-factor case (triple rads, but slim rads, obstructed airflow, restricted loop), so in short, cooling is less efficient than in a regular ATX case.

I'm running +80 core and +1200 memory with the 450W BIOS.
Here is a screenshot during a FurMark stress test.

I find my hotspot temp pretty high!

Are you guys seeing the same range?
Thanks!


----------



## Falkentyne

noxyd said:


> Hi guys,
> 
> I am running an FTW3 with an EK block.
> I am also in a small-form-factor case (triple rads, but slim rads, obstructed airflow, restricted loop), so in short, cooling is less efficient than in a regular ATX case.
> 
> I'm running +80 core and +1200 memory with the 450W BIOS.
> Here is a screenshot during a FurMark stress test.
> 
> I find my hotspot temp pretty high!
> 
> Are you guys seeing the same range?
> Thanks!
> 
> View attachment 2483447


You should repaste.
The acceptable delta range is 11C (perfect) to 17C (marginal).
You're at a 36C delta. That means you don't have sufficient contact pressure on the die, or you have a hot spot without proper paste on part of the die. Check your mount.
The best way to apply paste to these cores is like this, a large X pattern with 4 small dots to ensure full coverage:

http://imgur.com/Xc1GEUO

Alternatively you can do this, as once recommended by Thermalright for one of their thermal compounds:

http://imgur.com/VWLqHCL
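Those thresholds boil down to a quick sanity check you can run on any core/hotspot readout. A sketch using the 11C/17C numbers above (my rule of thumb, not an official NVIDIA spec):

```python
def judge_mount(core_c, hotspot_c, perfect=11, marginal=17):
    """Classify a GPU mount from the hotspot-minus-core delta in C."""
    delta = hotspot_c - core_c
    if delta <= perfect:
        verdict = "good mount"
    elif delta <= marginal:
        verdict = "marginal, keep an eye on it"
    else:
        verdict = "repaste/remount: insufficient die contact"
    return delta, verdict

print(judge_mount(50, 62))   # 12C delta: marginal
print(judge_mount(45, 81))   # 36C delta: repaste
```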


----------



## EarlZ

Falkentyne said:


> You should repaste.
> The acceptable delta range is 11C (perfect) to 17C (marginal).
> You're at a 36C delta. That means you don't have sufficient contact pressure on the die, or you have a hot spot without proper paste on part of the die. Check your mount.
> The best way to apply paste to these cores is like this, a large X pattern with 4 small dots to ensure full coverage:
> 
> 
> http://imgur.com/Xc1GEUO
> 
> 
> Alternatively you can do this, as once recommended by Thermalright for one of their thermal compounds:
> 
> 
> http://imgur.com/VWLqHCL


Which Thermalright compound is best for a GPU, and what improvement should be expected compared to, say, Kryonaut?


----------



## Falkentyne

EarlZ said:


> Which Thermalright compound is best for a GPU, and what improvement should be expected compared to, say, Kryonaut?


Thermalright TFX (or Thermagic ZF-EX) is probably the best compound, followed by MasterGel Maker Nano (new-batch version), IC Diamond (sometimes; viscosity is more important than pure W/mK numbers!), and Thermalright TF8.
The main issue is how convex the GPU core is, so you want a thick paste. That's why you saw people like Luumi and JayzTwoCents sand their die for LN2 runs: the convex die was causing issues.

For the person I replied to, however, it has absolutely nothing to do with the paste he is using. He has a contact/balance/spread application problem.


----------



## Imprezzion

So I kinda got lucky with mine, I guess? I mean, with a convex die, liquid metal TIM should have a very hard time making good contact. I'm running Liquid Ultra on mine (stock Gigabyte Gaming OC cooler), and it wasn't a very generous application either, but my temps are amazing: about 58C core and 67C hotspot at +105 / +1250, bouncing off the power limit, so it can't physically get any hotter as it can't draw more power/current anyway.


----------



## noxyd

Falkentyne said:


> You should repaste.
> The acceptable delta range is 11C (perfect) to 17C (marginal).
> You're at a 36C delta. That means you don't have sufficient contact pressure on the die, or you have a hot spot without proper paste on part of the die. Check your mount.
> The best way to apply paste to these cores is like this, a large X pattern with 4 small dots to ensure full coverage:
> 
> 
> http://imgur.com/Xc1GEUO
> 
> 
> Alternatively you can do this, as once recommended by Thermalright for one of their thermal compounds:
> 
> 
> http://imgur.com/VWLqHCL


Thanks for this clear and detailed answer!
Looks like I've found my mission for next weekend!
I have Thermal Grizzly Kryonaut on hand (the current paste is the stock one EK provided with the block).


----------



## EarlZ

Falkentyne said:


> Thermalright TFX (or Thermagic ZF-EX) is probably the best compound, followed by MasterGel Maker Nano (new-batch version), IC Diamond (sometimes; viscosity is more important than pure W/mK numbers!), and Thermalright TF8.
> The main issue is how convex the GPU core is, so you want a thick paste. That's why you saw people like Luumi and JayzTwoCents sand their die for LN2 runs: the convex die was causing issues.
> 
> For the person I replied to, however, it has absolutely nothing to do with the paste he is using. He has a contact/balance/spread application problem.


@Falkentyne For the 3080/3090s, how much better would TFX or MasterGel Maker Nano be in terms of temperatures? I plan to change the thermal pads in the future and would gladly switch from Kryonaut to TFX.

Would you also know whether TFX is a lot more durable than Kryonaut? I've seen several posts saying Kryonaut is only good for 9-12 months before a repaste is needed.


----------



## Imprezzion

Oh my god. I got a notification that the EVGA FTW3 Ultra was in stock at a local shop. Had a look. Yup, €2149... Shops are just as guilty of scalping..


----------



## noxyd

Falkentyne said:


> You should repaste.
> The acceptable delta range is 11C (perfect) to 17C (marginal).
> You're at a 36C delta. That means you don't have sufficient contact pressure on the die, or you have a hot spot without proper paste on part of the die. Check your mount.
> The best way to apply paste to these cores is like this, a large X pattern with 4 small dots to ensure full coverage:
> 
> 
> http://imgur.com/Xc1GEUO
> 
> 
> Alternatively you can do this, as once recommended by Thermalright for one of their thermal compounds:
> 
> 
> http://imgur.com/VWLqHCL


So it turns out that you were absolutely right.
I repasted following the 1st picture you linked, and not only is the gap between hotspot and GPU temp down to 14C, my GPU temp itself has also improved by 10C!

I'm not an expert, but visually I couldn't really see any obvious issue with my previous paste application.
One thing I noted, though, is that I may have over-tightened the screws the first time (the washers were literally crushed against the PCB), so I tightened them more gently this time.

Many thanks!


----------



## Falkentyne

noxyd said:


> So it turns out that you were absolutely right.
> I repasted following the 1st picture you linked, and not only is the gap between hotspot and GPU temp down to 14C, my GPU temp itself has also improved by 10C!
> 
> I'm not an expert, but visually I couldn't really see any obvious issue with my previous paste application.
> One thing I noted, though, is that I may have over-tightened the screws the first time (the washers were literally crushed against the PCB), so I tightened them more gently this time.
> 
> Many thanks!


Good work. What did you do the first time? Did you spread it, or did you do a dot in the middle?


----------



## EarlZ

@Falkentyne What kind of temperature improvement can we expect from TFX coming from Kryonaut? Would you be able to provide this detail before running off? I am very interested in switching paste, as Kryonaut is not that durable and usually requires a repaste each year.


----------



## marashz

EarlZ said:


> @Falkentyne What kind of temperature improvement can we expect from TFX coming from Kryonaut? Would you be able to provide this detail before running off? I am very interested in switching paste, as Kryonaut is not that durable and usually requires a repaste each year.


I haven't tried TFX myself, but I don't think you will gain more than 1C in any scenario, unless you applied the other paste wrong, or you go liquid metal.


----------



## Falkentyne

EarlZ said:


> @Falkentyne What kind of temperature improvement can we expect from TFX coming from Kryonaut? Would you be able to provide this detail before running off? I am very interested in switching paste, as Kryonaut is not that durable and usually requires a repaste each year.


I don't know. Try it and report your results. I don't have as good living or cooling conditions as most of you.
TFX was 1-2C better than Kryonaut on my 10900K. Maybe 1-2C better on my 3090. I have nothing new to add that I haven't already posted multiple times.


----------



## noxyd

Falkentyne said:


> Good work. What did you do the first time? Did you spread it or did you do a dot in the middle?


I don't recall exactly what I did the first time.
Here is a picture from yesterday when I removed the block.
I can't really find an obvious spot missing paste.


----------



## arrow0309

jura11 said:


> Hi there
> 
> In normal gaming I see 36-38°C and in rendering I see 32-33°C max, and VRAM temperatures in gaming won't break 60°C; in rendering they're in the 60s all the time
> 
> I'm running 1495 MHz on VRAM on the first one, and on the second one I use 1200 MHz, e.g. for rendering or benchmarks
> 
> Only in mining do you see higher VRAM temperatures; normally the hotspot is in the low 40s and core temperatures are in the 30s with the KPE XOC BIOS capped at 65% (capped like that, each GPU pulls around 446W)
> 
> Hope this helps
> 
> Thanks, Jura


Hi Jura, it arrived this afternoon and looks good enough quality-wise; I will probably start installing it tomorrow (also cleaning up all my rads and changing the tubes and the coolant).
The thing is, I don't know where I should place all those 0.5mm black plastic spacers and also those tiny transparent washers.

Here are some photos of the block:


----------



## weleh

Hey guys,

So bought a 3070, sold it, bought a 6800XT, sold it, and now I have a 3080 Suprim X.

Trying to figure out what the usual memory cap is on these cards. I think the Suprim X comes with stock memory, no overclock.
My card is seeing a performance uplift and no artifacting at a +1400 offset, which means it's running at about 21.8 Gbps?

Is this normal for 3080s? On the other hand, the core is very weak, maxing out at just about a +115 offset, which means it doesn't boost past 2100 MHz.

Any answers appreciated regarding this.

Thanks
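A quick way to sanity-check numbers like these: assuming the offset is in MHz, that the effective (double data rate) speed moves by twice the offset from the 19 Gbps stock rate, and the 3080's 320-bit bus (assumptions from the card's specs, not spelled out in the post), the math works out as:

```python
# Rough sanity check for Afterburner-style memory offsets on a 10 GB 3080.
# Assumptions: offset is in MHz, effective (double data rate) speed moves
# by 2x the offset, stock rate is 19 Gbps on a 320-bit bus.

BASE_RATE_MBPS = 19000   # stock GDDR6X effective rate
BUS_WIDTH_BITS = 320

def effective_gbps(offset_mhz: int) -> float:
    """Effective memory speed in Gbps for a given MHz offset."""
    return (BASE_RATE_MBPS + 2 * offset_mhz) / 1000

def bandwidth_gbs(offset_mhz: int) -> float:
    """Theoretical memory bandwidth in GB/s."""
    return effective_gbps(offset_mhz) * BUS_WIDTH_BITS / 8

print(effective_gbps(1400))   # 21.8
print(bandwidth_gbs(1400))    # 872.0
```

Under those assumptions, +1400 lands at 21.8 Gbps effective and roughly 872 GB/s of theoretical bandwidth, up from the stock 760 GB/s.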


----------



## 6u4rdi4n

arrow0309 said:


> --snip--


Transparent washers go between the screws and the PCB. I believe the black spacers go between the PCB and the backplate, if you have one.


----------



## arrow0309

6u4rdi4n said:


> Transparent washers go between the screws and the PCB. I believe the black spacers go between the PCB and the backplate, if you have one.


This is the only image with the installation details. I'm gonna use my backplate (the original MSI one, pictured below):










Now where would those washers go?


----------



## 6u4rdi4n

arrow0309 said:


> --snip--


Looks like they refer to the washers as a "0.5MM Gasket".


----------



## jura11

arrow0309 said:


> Hi Jura, arrived this afternoon, looks good enough quality wise , will probably start installing it tomorrow (also cleaning up all my rads and changing the tubes and the coolant).
> The fact is I don't know where should I place all those 0.5mm black plastic spacers and also those tiny transparent washers.
> 
> Here are some photos of the block:
> 
> View attachment 2483944
> View attachment 2483945
> View attachment 2483946
> View attachment 2483947


Hi there

I'm running two of these blocks on my RTX 3090 GamingPros with no issues.

The 0.5mm plastic spacers should be placed on the acrylic standoffs; you will see where they need to go if you try a dry mount without the thermal pads or TIM. For TIM I highly recommend Kryonaut, NT-H1 or Thermalright TFX.

Plastic washers go under the spring screws, underneath the backplate, and you shouldn't have any issues.

This manual should help you with where to place the washers etc.:

Bykski Full Coverage GPU Water Block for Colorful iGame RTX 3080/3090 Vulcan / Neptune (N-IG3090VXOC-X) (www.bykski.us)
Specifications: Block Material: Electroplated High Purity Copper / Clear Acrylic; Backplate Material: Black Anodized Aluminum (not in contact with coolant). Compatibility (including but not limited to): Colorful iGame RTX 3080 Vulcan X OC, Colorful iGame RTX 3080 Vulcan OC 10G...

Some people use double washers, which can improve temperatures, but in my case I used only a single washer.

Hope this helps

Thanks, Jura


----------



## Tom Base

I used my ROG-STRIX-RTX3080-O10G-GAMING with a water block from EKWB without any issues. For some reason I'd like to revert the watercooled solution to the original fan-based cooling. To achieve this I need to re-apply thermal pads, which were destroyed during the initial disassembly.
To be sure where to put which thickness of thermal pad on the ROG-STRIX-RTX3080-O10G-GAMING, I would like to know if there is a manual or documentation which shows exactly where I need to put which thermal pads on the front and back side of the PCB. Up to now I could not find a qualified answer (pictures, descriptions, manuals).

Any hint would be appreciated.
cheers
Tom


----------



## arrow0309

Tom Base said:


> I used my ROG-STRIX-RTX3080-O10G-GAMING with a water block from EKWB without any issues. For some reason I'd like to revert the watercooled solution to the original fan-based cooling. To achieve this I need to re-apply thermal pads, which were destroyed during the initial disassembly.
> To be sure where to put which thickness of thermal pad on the ROG-STRIX-RTX3080-O10G-GAMING, I would like to know if there is a manual or documentation which shows exactly where I need to put which thermal pads on the front and back side of the PCB. Up to now I could not find a qualified answer (pictures, descriptions, manuals)
> 
> Any hint would be appreciated.
> cheers
> Tom


You can find some pics here:

Review of the ASUS ROG STRIX GeForce RTX 3080 Gaming OC video card (GreenTech_Reviews, greentechreviews.ru)
Review and testing of the ASUS ROG STRIX GeForce RTX 3080 Gaming OC video card

Don't you still have the old ones attached to the cooler?
It would be useful to know their thickness.


----------



## EarlZ

Falkentyne said:


> I don't know. Try it and report your results. I don't have as good living conditions or cooling conditions as most of you.
> TFX was 1-2C better on my 10900k vs Kryonaut. Maybe 1-2C better on my 3090. I have nothing new to add that I haven't already posted multiple times already.


I will when I decide to repad my card. I don't think the stock pads can remain very effective the second time around, and I've already changed the stock paste, as I was getting 85°C even at a low 0.800V with 100% fan speed.


----------



## Tom Base

arrow0309 said:


> You can find some pics here:
> 
> Review of the ASUS ROG STRIX GeForce RTX 3080 Gaming OC video card (GreenTech_Reviews, greentechreviews.ru)
> Review and testing of the ASUS ROG STRIX GeForce RTX 3080 Gaming OC video card
> 
> Don't you still have the old ones attached to the cooler?
> It would be useful to know their thickness.


Most of the pads are still attached and I'm able to measure their thickness. Some are completely destroyed; well, I pulled them off before measuring, my bad. Thank you for the link anyway.


----------



## Colonel_Klinck

Guys, does anyone know the thermal pad thickness on the MSI Suprim? A mate grabbed one and wants to change them. Looking around it seems they are 1.5mm, but I just want to confirm.


----------



## EarlZ

Colonel_Klinck said:


> Guys anyone know the thermal pads thickness on the MSI Suprim? A mate grabbed one and wants to change them. Looking around it seems they are 1.5mm but just want to confirm.


2mm for the front GDDR6X


----------



## EarlZ

Falkentyne said:


> I don't know. Try it and report your results. I don't have as good living conditions or cooling conditions as most of you.
> TFX was 1-2C better on my 10900k vs Kryonaut. Maybe 1-2C better on my 3090. I have nothing new to add that I haven't already posted multiple times already.


Would you know if the TFX doesn't break down/pump out at 80°C like the Kryonaut?


----------



## arrow0309

jura11 said:


> Hi there
> 
> I'm running two of these blocks on my RTX 3090 GamingPros with no issues.
> 
> The 0.5mm plastic spacers should be placed on the acrylic standoffs; you will see where they need to go if you try a dry mount without the thermal pads or TIM. For TIM I highly recommend Kryonaut, NT-H1 or Thermalright TFX.
> 
> Plastic washers go under the spring screws, underneath the backplate, and you shouldn't have any issues.
> 
> This manual should help you with where to place the washers etc.:
> 
> Bykski Full Coverage GPU Water Block for Colorful iGame RTX 3080/3090 Vulcan / Neptune (N-IG3090VXOC-X) (www.bykski.us)
> 
> Some people use double washers, which can improve temperatures, but in my case I used only a single washer.
> 
> Hope this helps
> 
> Thanks, Jura


Thanks mate. I have both the Kryonaut and the TFX; the TFX seemed a little difficult to spread when I tried it using an X plus a small centre line on my 10900K a few months ago.
I also recently bought the newer batch of Cooler Master Maker Nano and put it on the GPU (it's irrelevant on air cooling anyway); it spreads very easily.

Hmmmm, I'll see, I can't go wrong with any of them.


----------



## weleh

The Suprim X cooler is ridiculous...

At 430W the hotspot is at 70°C and memory junction below 60°C at 100% fan speed.
Shame my card doesn't have a strong core (2080 MHz effective in games; Port Royal and TS can be done with a bit higher clocks, but not stable in-game).
The memory, though, overclocks like a champ: +1400 (nearly 22 Gbps effective) without regression. 50 MHz more and it starts losing performance.


----------



## Ziver

I want to upgrade my BIOS. I have the Gigabyte Xtreme 3080. They have 3 new BIOSes; which one should I take?


----------



## hubsahubsa

Ziver said:


> I want to upgrade my BIOS. I have the Gigabyte Xtreme 3080. They have 3 new BIOSes; which one should I take?


"If your VBIOS version is:
F1, it can only be updated with VBIOS versions F2-F9.
F10, it can only be updated with VBIOS versions F11-F19.
F20, it can only be updated with VBIOS versions F21-F29."
You can see your version in GPU-Z. Advanced -> drop-down menu NVIDIA BIOS

I wonder if it's still fine to flash a BIOS from a different model. Currently I have an Eagle OC card with the Gaming OC BIOS. Now they actually have 2 different files, one for flashing the OC BIOS and one for flashing the Silent BIOS.
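The quoted series rule is mechanical enough to sketch as a small helper. This is only an illustration of the rule as stated above; the "F&lt;n&gt;" string format is an assumption about how the tool labels versions:

```python
# Illustration of Gigabyte's stated VBIOS series rule:
# F1 can only go to F2-F9, F10 to F11-F19, F20 to F21-F29.
# The "F<n>" version strings are an assumed labeling convention.

def allowed_updates(version: str) -> list[str]:
    n = int(version.lstrip("F"))
    base = (n // 10) * 10          # start of the tens series (0, 10, 20, ...)
    start = max(n + 1, base + 1)   # never "update" to the same or an older version
    return [f"F{i}" for i in range(start, base + 10)]

print(allowed_updates("F1"))   # ['F2', 'F3', 'F4', 'F5', 'F6', 'F7', 'F8', 'F9']
print(allowed_updates("F20"))  # ['F21', 'F22', ..., 'F29']
```

So under this rule an F20-series card would accept an F22 file, while an F1-series card would not.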


----------



## weleh

I updated my Suprim X and it flashed both BIOSes automatically.


----------



## edhutner

The Suprim X has a new BIOS? I guess it should be related to the Resizable BAR option. Has anybody here benchmarked the impact of this new option?


----------



## weleh

MSI R_BAR BIOS AVAILABLE (www.overclock.net)

Hey guys, MSI already released the vBIOS for Ampere to add R_BAR support. You need to download MSI LIVE UPDATE, scroll to the BIOS section and click check. It will come up; click install. Open the file and it will self-extract the BIOS and other stuff. It will auto-update, you don't need nvflash...


----------



## OleMortenF

Do you guys know which BIOS version enables Resizable BAR for the 3080 Suprim X?
I can only find VBIOS version 94.02.42.00.F9 in MSI Live Update version 162.


----------



## weleh

When I looked up bios on Live Updater there was only 1 bios.

Do you have more than 1?


----------



## OleMortenF

Only 1 for me


----------



## weleh

Then that's the one.
It's a BIOS from today, 30/03/2021.


----------



## edhutner

I am trying the latest MSI Live Update, but it doesn't show a new BIOS for my 3080 Suprim X. Any idea how to get it?
Currently I am on the "gaming" position of the hardware switch. Do I have to move it back to "silent" in order to get the new BIOS?


----------



## OleMortenF

You have to click the arrow pointer until the BIOS menu shows up


----------



## weleh

Arrow top right, click on it until you reach BIOS


----------



## edhutner

Aah, thanks.
But it's still not finding a BIOS for me.


----------



## Glottis

I've seen some reports that the BAR BIOS lowered the TGP / OC potential of some cards. Has anyone here noticed that behavior?


----------



## mouacyk

I wouldn't be surprised if all the AIBs took advantage of this BIOS release to tune down their BIOSes to alleviate RMAs, especially from potential GDDR6X failures. *Tin-foil hat on*


----------



## cstkl1

F1 2020, high, Australia | dry | cycle
11900K + 3080 Strix, both stock
ReBAR disabled: 360 fps
ReBAR enabled: 260 fps


----------



## kairi_zeroblade

cstkl1 said:


> F1 2020, high, Australia | dry | cycle
> 11900K + 3080 Strix, both stock
> ReBAR disabled: 360 fps
> ReBAR enabled: 260 fps


Isn't it the other way around? Are you sure? Why get lower fps after the update? Though yeah, I would also complain. For me it's around 5-7 fps gained on average in games, aside from the better 1% lows too.


----------



## rjrusek

Why does Gigabyte do this?

Obviously it does not matter which BIOS you install since all the cross flashing has been proven here...


----------



## rjrusek

hubsahubsa said:


> "If your VBIOS version is:
> F1, it can only be updated with VBIOS versions F2-F9.
> F10, it can only be updated with VBIOS versions F11-F19.
> F20, it can only be updated with VBIOS versions F21-F29."
> You can see your version in GPU-Z. Advanced -> drop-down menu NVIDIA BIOS
> 
> I wonder if it's still fine to flash a BIOS from a different model. Currently i have an Eagle OC card with Gaming OC BIOS. Now they actually have 2 different files, one for flashing the OC BIOS and one for flashing the Silent BIOS.


Please keep us posted on whether you successfully get up and running on a Resizable BAR BIOS.


----------



## hubsahubsa

rjrusek said:


> Please keep us posted on whether you successfully get up and running on a Resizable BAR BIOS.


Yes, it's working just fine. Updated motherboard BIOS, enabled "4G Decoding", set "Re-Size BAR Support" to auto, disabled "CSM Support", flashed the F22 Gaming OC "OC BIOS" on my Eagle OC and then updated graphics drivers.


----------



## rjrusek

hubsahubsa said:


> Yes, it's working just fine. Updated motherboard BIOS, enabled "4G Decoding", set "Re-Size BAR Support" to auto, disabled "CSM Support", flashed the F22 Gaming OC "OC BIOS" on my Eagle OC and then updated graphics drivers.


Where did you get the F22 Gaming OC "OC BIOS? How did you flash it?

Did you just use the file from the Gigabyte website? I thought that file would try to flash the DUAL bios since the Gaming OC has DUAL BIOS while Eagle OC does not.


----------



## hubsahubsa

rjrusek said:


> Where did you get the F22 Gaming OC "OC BIOS? How did you flash it?
> 
> Did you just use the file from the Gigabyte website? I thought that file would try to flash the DUAL bios since the Gaming OC has DUAL BIOS while Eagle OC does not.


Yeah, GeForce RTX™ 3080 GAMING OC 10G Support | Graphics Card - GIGABYTE Global, file "N3080GOL.F22".
Flashing was done simply by executing "N3080GOL.F22.exe". It took like 10 seconds.


----------



## phoenixyz

Guys, I have an MSI Gaming X Trio 3080. I flashed a Strix 450W BIOS and it works perfectly. Now I want to update to the latest vBIOS with ReBAR support for the Strix. Is it safe to flash this new BIOS with nvflash using the same old method from page 1 of this thread, or do I need to use an updater tool? Has anyone flashed a different BIOS, like the Strix ReBAR BIOS, on an MSI 3080?


----------



## Broder

I have a Gigabyte Vision OC RTX 3080, the Resizable BAR BIOS update seems broken for me.

The first thing that caught my eye was the sheer number of different BIOS versions for the RTX 3080 Vision OC, there are 6 in total (F3, F12, F21, F31, F51 and F80). A note below states:

Please note:
You can only update to a VBIOS version of the same series. Visual C++ 2008 is required for this BIOS flash.
If your VBIOS version is:
F1, it can only be updated with VBIOS versions F2-F9.
F10, it can only be updated with VBIOS versions F11-F19.
F20, it can only be updated with VBIOS versions F21-F29.

So great, I get into GPU-Z to check my BIOS version, and this is what I find:
BIOS Version 94.02.26.80.3C

So I don't know what BIOS version I have. As it turns out, you must download the Gigabyte AORUS Engine to find out your BIOS version (but they don't state that anywhere on the website); mine was F1. So for me, the update is the F3 BIOS.

I downloaded the new F3 BIOS (Resizable BAR) from the website, extracted the zip file. Once I try opening the N3080VO.F3.exe file, after going through Windows safety protocols, a window asks "Are you sure you update your Graphics BIOS?" (this isn't a typo), I click "Yes" and get the following message:

Windows cannot find 'C:\Users\xxx\AppData\Local\Temp\N3080VO.F3.exe'. Make sure you typed the name correctly, and then try again.

So I tried to manually extract all the files directly to the mentioned folder, this eliminates the error message, but the application gets stuck in "Are you sure you update your Graphics BIOS?" window. I click on "Yes", nothing happens and the window just comes back on, asking me the same thing. So I click "No" to get out and, eerily, I get a "Flash BIOS success ! Would you reboot the system ?" (again, no typo). I can click "Yes" to reboot my system, but obviously, my card hasn't really been flashed, it's just the buggy update software.

Apparently, other Gigabyte owners are flashing their cards with no issues, so it's either something in my machine, or maybe just the specific 3080 Vision OC F3 BIOS that's messed up. Unlike flashing through NVflash, the Gigabyte software won't allow you to flash any other BIOS version that isn't directly compatible with the one you have. So I'll either have to flash my Vision OC with some other BIOS (say, the Gaming OC BIOS) through NVflash, and then try updating the card this way through Gigabyte's software, or I'll just wait for the correct Resizable BAR version of my BIOS to show up in the BIOS database and flash it through NVflash. I also do not know how to extract the BIOS file from Gigabyte's executable to be able to upload it straight to my card through NVflash, otherwise I would just do that.

If anyone knows a workaround for this problem, please give me a heads up.


----------



## Nizzen

cstkl1 said:


> F1 2020, high, Australia | dry | cycle
> 11900K + 3080 Strix, both stock
> ReBAR disabled: 360 fps
> ReBAR enabled: 260 fps


This game is on the ReBAR supported list.
Strange


*GeForce RTX 30 Series Performance Accelerates With Resizable BAR Support*
By Andrew Burnes on March 30, 2021
Resizable BAR utilizes an advanced feature of PCI Express to increase performance in certain games. As of March 30th, 2021, Resizable BAR is supported for GeForce RTX 30 Series graphics cards and laptops.
For desktops to take advantage of Resizable BAR, users need a GeForce RTX 30 Series graphics card with a supported VBIOS, a compatible CPU, compatible motherboard, motherboard SBIOS update, and our newest GeForce Game Ready driver.
For GeForce RTX 30 Series laptops, please check with the manufacturer to see if Resizable BAR is supported on a specific model.
For instructions on how to enable Resizable BAR, continue reading. 

*What Is Resizable BAR?*
Resizable BAR is an optional PCI Express interface technology. As you move through a world in a game, GPU memory (VRAM) constantly transfers textures, shaders and geometry via many small CPU to GPU transfers.
With the ever-growing size of modern game assets, this results in a _lot_ of transfers. Using Resizable BAR, assets can instead be requested as-needed and sent in full, so the CPU can efficiently access the entire frame buffer. And if multiple requests are made, transfers can occur concurrently, rather than queuing. 
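For what it's worth, on Linux one rough way to check whether the enlarged BAR actually took effect is to parse the GPU's sysfs resource file, which holds one "start end flags" hex triple per PCI region. This is only a sketch of that idea; the PCI address in the usage comment is an example, not something from this thread:

```python
# Parse a PCI device's sysfs "resource" file and report region sizes.
# With Resizable BAR active, one of the GPU's memory BARs should be
# large enough to cover the whole frame buffer instead of the classic
# 256 MiB window.

def bar_sizes(resource_text: str) -> list[int]:
    """Size in bytes of each PCI region (0 for unused regions)."""
    sizes = []
    for line in resource_text.splitlines():
        start, end, _flags = (int(x, 16) for x in line.split())
        sizes.append(end - start + 1 if end else 0)
    return sizes

# Usage on a real system (the address is an example):
#   with open("/sys/bus/pci/devices/0000:01:00.0/resource") as f:
#       print(bar_sizes(f.read()))

# Demo with a synthetic entry: one 256 MiB region plus an empty one.
sample = ("0x00000000a0000000 0x00000000afffffff 0x0000000000040000\n"
          "0x0000000000000000 0x0000000000000000 0x0000000000000000\n")
print(bar_sizes(sample))  # [268435456, 0]
```

On a card where Resizable BAR is active, you would expect one region in the multi-GiB range rather than 256 MiB.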



*Installation Steps*
To successfully activate Resizable BAR on a desktop PC, please follow these steps in order, referring to the expanded installation instructions below:

Confirm you have a compatible CPU & CPU chipset (see list below)
Confirm you have a compatible motherboard (see list below)
Update your motherboard SBIOS, if required, by installing an update from the manufacturer. Then enable Resizable BAR support in your motherboard’s BIOS interface
Update to the latest GeForce Game Ready Driver (version 465.89 WHQL at the time of writing, released March 30th, 2021), or a later version
If you have a GeForce RTX 3060, you’re good to go. If you have a GeForce RTX 3060 Ti, 3070, 3080, or 3090, then you may require an updated VBIOS
If you have a Founders Edition graphics card from NVIDIA, get your VBIOS update tool directly from our website. If you have a custom partner card, get the update tool from their site (see list below)
Verify Resizable BAR is enabled in the NVIDIA Control Panel (see instructions below)

*What CPUs and CPU Platforms Support Resizable BAR?*
Resizable BAR requires CPU and motherboard compatibility, and a Resizable BAR SBIOS update. As of March 30th, 2021, the following CPU chipsets and CPUs officially support Resizable BAR on GeForce RTX 30 Series desktop GPUs:

*Desktop CPU and Chipset Support*


*AMD Chipsets*: AMD 400 Series (on motherboards with AMD Zen 3 Ryzen 5xxx CPU support), AMD 500 Series

*AMD CPUs* (AMD Zen 3): Ryzen 3 5xxx, Ryzen 5 5xxx, Ryzen 7 5xxx, Ryzen 9 5xxx

*Intel Chipsets*: Intel 10th Gen: Z490, H470, B460, H410; Intel 11th Gen S: all 11th Gen chipsets available as of March 30th, 2021

*Intel CPUs*: Intel 10th Gen: i9-10xxx, i7-10xxx, i5-10xxx, i3-10xxx CPUs; Intel 11th Gen S-Series: i9-11xxx, i7-11xxx, i5-11xxx CPUs
*Motherboard Support*
NVIDIA is working with motherboard manufacturers around the world to bring Resizable BAR support to compatible products. As of March 30th, 2021, the following manufacturers are offering SBIOS updates for select motherboards to enable Resizable BAR with GeForce RTX 30 Series desktop graphics cards:


*Motherboard Manufacturers Supporting Resizable BAR*: ASUS, ASRock, COLORFUL, EVGA, GIGABYTE, MSI
Head to the manufacturers’ website to discover whether your motherboard is compatible, and to download and install updates if available. Additionally, please refer to the manufacturer’s documentation for enabling Resizable BAR support in the BIOS after installation of new firmware.
Please note, some motherboard manufacturers have unofficially extended Resizable BAR support to prior generation products. Your mileage may vary utilizing these solutions.

*Enabling Resizable BAR On Your GeForce RTX 30 Series Graphics Card*
Having updated your motherboard, you’ll need to update your GPU, and download and install a GeForce Game Ready Driver, released March 30th, 2021, or later.

*Game Ready Driver Update*
In practice, the performance benefits of Resizable BAR can vary substantially from game to game. In our testing, we’ve found some titles benefit from a few percent, up to 12%. However, there are also titles that see a decrease in performance, so NVIDIA will be pre-testing titles and using game profiles to enable Resizable BAR only when it has a positive performance impact. That way you won’t have to worry about bugs or performance decreases, and won’t have to rely on the community to benchmark each title and discover whether Resizable BAR is beneficial in the games you’re playing.













_Popular, graphically demanding titles can run faster and smoother with Resizable BAR, further improving your experience_
With the release of our GeForce Game Ready Driver on March 30th, 2021, we’re enabling Resizable BAR support on all GeForce RTX 30 Series GPUs, and expanding support to an additional 9 games, for a total of 17:


*GeForce RTX 30 Series Resizable BAR Supported Games* (as of March 30th, 2021): Assassin's Creed Valhalla, Battlefield V, Borderlands 3, Control, Cyberpunk 2077, Death Stranding, DIRT 5, F1 2020, Forza Horizon 4, Gears 5, Godfall, Hitman 2, Hitman 3, Horizon Zero Dawn, Metro Exodus, Red Dead Redemption 2, Watch Dogs Legion


----------



## zayd

I've just updated my RTX 3090 Suprim X to the latest VBIOS, which enables Resizable BAR on my Gigabyte Aorus Ultra Z390. Yes, I said that right: Z390, with an i9 9900K. NVIDIA said this only applies to the newer 10th and 11th gen Intel CPUs, but that's crap, as my motherboard got a Resizable BAR option added months ago, when I was still rocking my 5700XT. I have verified that it is now working, as I have the large memory range listed in Device Manager for the display adapter. Woo hoo, one more FPS during gaming!


----------



## phoenixyz

zayd said:


> I've just updated my RTX 3090 Suprim X to the latest VBIOS, which enables Resizable BAR on my Gigabyte Aorus Ultra Z390. Yes, I said that right: Z390, with an i9 9900K. NVIDIA said this only applies to the newer 10th and 11th gen Intel CPUs, but that's crap, as my motherboard got a Resizable BAR option added months ago, when I was still rocking my 5700XT. I have verified that it is now working, as I have the large memory range listed in Device Manager for the display adapter. Woo hoo, one more FPS during gaming!


I am using a Strix BIOS on my MSI Gaming X; I can't just do a direct vBIOS update. I want to flash an updated Strix BIOS with ReBAR support, but I am worried it might screw things up since it is an MSI card running an older Strix BIOS.


----------



## mouacyk

zayd said:


> I've just updated my RTX 3090 Suprim X to the latest VBIOS, which enables Resizable BAR on my Gigabyte Aorus Ultra Z390. Yes, I said that right: Z390, with an i9 9900K. NVIDIA said this only applies to the newer 10th and 11th gen Intel CPUs, but that's crap, as my motherboard got a Resizable BAR option added months ago, when I was still rocking my 5700XT. I have verified that it is now working, as I have the large memory range listed in Device Manager for the display adapter. Woo hoo, one more FPS during gaming!


By now, I think people realize that NVIDIA means they will only support ReBAR issues with 10th and 11th gen Intel CPUs.


----------



## KHUNGOLF

Broder said:


> I have a Gigabyte Vision OC RTX 3080, the Resizable BAR BIOS update seems broken for me.
> 
> The first thing that caught my eye was the sheer number of different BIOS versions for the RTX 3080 Vision OC, there are 6 in total (F3, F12, F21, F31, F51 and F80). A note below states:
> 
> Please note:
> You can only update to a VBIOS version of the same series. Visual C++ 2008 is required for this BIOS flash.
> If your VBIOS version is:
> F1, it can only be updated with VBIOS versions F2-F9.
> F10, it can only be updated with VBIOS versions F11-F19.
> F20, it can only be updated with VBIOS versions F21-F29.
> 
> So great, I get into GPU-Z to check my BIOS version, and this is what I find:
> BIOS Version 94.02.26.80.3C
> 
> So I don't know what BIOS version I have. As it turns out, you must download Gigabyte AORUS Engine to find out your BIOS version (but they don't state that anywhere in the website), mine was F1. So for me, the update is the F3 BIOS.
> 
> I downloaded the new F3 BIOS (Resizable BAR) from the website, extracted the zip file. Once I try opening the N3080VO.F3.exe file, after going through Windows safety protocols, a window asks "Are you sure you update your Graphics BIOS?" (this isn't a typo), I click "Yes" and get the following message:
> 
> Windows cannot find 'C:\Users\xxx\AppData\Local\Temp\N3080VO.F3.exe'. Make sure you typed the name correctly, and then try again.
> 
> So I tried to manually extract all the files directly to the mentioned folder, this eliminates the error message, but the application gets stuck in "Are you sure you update your Graphics BIOS?" window. I click on "Yes", nothing happens and the window just comes back on, asking me the same thing. So I click "No" to get out and, eerily, I get a "Flash BIOS success ! Would you reboot the system ?" (again, no typo). I can click "Yes" to reboot my system, but obviously, my card hasn't really been flashed, it's just the buggy update software.
> 
> Apparently, other Gigabyte owners are flashing their cards with no issues, so it's either something in my machine, or maybe just the specific 3080 Vision OC F3 BIOS that's messed up. Unlike flashing through NVflash, the Gigabyte software won't allow you to flash any other BIOS version that isn't directly compatible with the one you have. So I'll either have to flash my Vision OC with some other BIOS (say, the Gaming OC BIOS) through NVflash, and then try updating the card this way through Gigabyte's software, or I'll just wait for the correct Resizable BAR version of my BIOS to show up in the BIOS database and flash it through NVflash. I also do not know how to extract the BIOS file from Gigabyte's executable to be able to upload it straight to my card through NVflash, otherwise I would just do that.
> 
> If anyone knows a workaround for this problem, please give me a heads up.


I got around it by unzipping the x64 folders using *"buildforge"* as the password and running the extracted .exe files. 😀

Asus TUF RTX 3080 flashed to the GIGABYTE AORUS MASTER RTX 3080 F4 BIOS. Everything works fine.


----------



## emil2424

Glottis said:


> I've seen some reports that the BAR BIOS lowered the TGP / OC potential of some cards. Has anyone here noticed that behavior?


I can confirm. I have an RTX 3090.
TGP before: 385-395W.
TGP after: 350-362W.
The GPU clock dropped by about 100-120 MHz.


----------



## Broder

It seems a lot of people are reporting lower power limits after the ReBAR BIOS update. Apparently, vendors are using the opportunity to silently decrease the power levels of their cards, maybe in an attempt to control the elevated GDDR6X temps. Either way, this seems like a bummer: losing 100-120 MHz of core clock on average seems worse overall than any possible performance gains ReBAR might give in the handful of games it actually works with.


----------



## erasmo0284

Hi guys, I'm new to the community. I have an MSI RTX 3080 Gaming Trio and I would like to shunt mod it and flash the BIOS to get the most out of the card. I would like to hear from you whether it's worth it.
I'm building a water loop with big radiators to keep temperatures down. What's the performance gain if I do both the shunt mod and the BIOS flash?
Do you guys know of a flashing tutorial?
I would like to hear opinions from people who know overclocking and who like to overclock. Thank you!


----------



## BluePaint

You can flash a 450W BIOS on the Trio. Shunting will give you minimal performance on top, maybe 1 or 2% in games.


----------



## erasmo0284

BluePaint said:


> U can flash 450W bios on trio. Shunting will give u minimal performance on top. Maybe 1 or 2% in games


Thank you, Brother. Do you know about any tutorial?


----------



## cstkl1

F1 2020, High, TAA Checkered, AF None, Australia, Dry, Cam: Cycle, 1080p
Stock CPU, stock Strix 3080, 3866C14 1:1

ReBAR enabled = 269 FPS


Spoiler






ReBAR disabled = 388 FPS


Spoiler


----------



## mouacyk

^^ ouch


----------



## edhutner

NVIDIA Enables GPU Passthrough for Virtual Machines on Consumer-Grade GeForce GPUs (www.techpowerup.com)

Editor's note: This is not a part of April Fools. NVIDIA has separated professional users and regular gamers with the company's graphics card offering. There is a GeForce lineup of GPUs, which represents a gaming-oriented version and its main task is to simply play games, display graphics, and...

Probably a 1st of April joke... or not?!

GeForce GPU Passthrough for Windows Virtual Machine (Beta) | NVIDIA (nvidia.custhelp.com)


----------



## marashz

edhutner said:


> NVIDIA Enables GPU Passthrough for Virtual Machines on Consumer-Grade GeForce GPUs
> 
> 
> Editor's note: This is not a part of April Fools. NVIDIA has separated professional users and regular gamers with the company's graphics card offering. There is a GeForce lineup of GPUs, which represents a gaming-oriented version and its main task is to simply play games, display graphics, and...
> 
> 
> 
> 
> www.techpowerup.com
> 
> 
> 
> 
> probably 1st april joke
> or not ?!
> 
> 
> 
> 
> 
> GeForce GPU Passthrough for Windows Virtual Machine (Beta) | NVIDIA
> 
> 
> 
> 
> 
> 
> 
> nvidia.custhelp.com











NVIDIA enables GPU passthrough for virtual machines on GeForce GPUs - VideoCardz.com (videocardz.com)

NVIDIA enables GeForce GPU passthrough for Windows virtual machines. Linux users can now play Windows games through virtual machines. NVIDIA will now fully support GeForce GPU passthrough, a technology that enables access to the GPU on a host machine from the virtual machine environment. This...





It was posted on the 30th of March, so it should be true.


----------



## edhutner

Yep, but as far as I understand it's for a Linux host and Windows guest.


----------



## Imprezzion

Guys, should I get the EVGA Hybrid kit (FTW3, because RGB) for my card? It isn't an EVGA card, but the block mount and VRAM plate seem pretty universal, and back on the GTX 1xxx and RTX 2xxx series it would fit basically any model with minor modifications. I would've put my Kraken X52 + G12 on it, but it doesn't have the right hole spacing and I kinda wanna keep that for my 2080 Ti.

I also have tons of copper VRAM and VRM heatsinks, and new thermal tape can be bought, so as long as the block itself fits I can make it work.


----------



## Glottis

Reading the 3090 thread, they have a 390W BIOS for 2x8pin cards, so it's obvious our 3080 BIOSes are lower wattage not because of safety; they just want to make the 3x8pin cards look better. It's frustrating when I know my TUF cooler could easily handle a 400W or even 450W BIOS.


----------



## mouacyk

Glottis said:


> Reading 3090 thread they have 390W BIOS for 2x8pin cards so it's obvious our 3080 BIOSes are lower wattage not because of safety, they just want to make 3x8pin cards look better. It's frustrating when I know my TUF cooler could easily handle 400W or even 450W BIOS.


Tell that to all the Ampere/RDNA2 haters who have never seen a 300W GPU before. I completely agree with you -- even got all the materials for the Easy Shunt Mod months ago, but stopped because there are too many inconsistent results. The only other option now is to solder, but with no GPU backup, it's a no go.


----------



## Imprezzion

mouacyk said:


> Tell that to all the Ampere/RDNA2 haters who have never seen a 300W GPU before. I completely agree with you -- even got all the materials for the Easy Shunt Mod months ago, but stopped because there are too many inconsistent results. The only other option now is to solder, but with no GPU backup, it's a no go.


I'm in the same boat as you. Got all the supplies and a soldering station, but with no backup cards at normal prices I won't risk it.

Actually, there are plenty of 3080s in stock here locally at webshops. Problem is, the cheapest one is €1899 and most proper 3x8pin models are €2300-2400...


----------



## Falkentyne

mouacyk said:


> Tell that to all the Ampere/RDNA2 haters who have never seen a 300W GPU before. I completely agree with you -- even got all the materials for the Easy Shunt Mod months ago, but stopped because there are too many inconsistent results. The only other option now is to solder, but with no GPU backup, it's a no go.





Imprezzion said:


> I'm in the same boat as you. Got all the supplies and soldering station but with no backup cards for normal prices I won't risk it.
> 
> Actually, there's plenty of 3080's in stock here locally at webshops. Problem is, the cheapest one is €1899 and most proper 3x8 models are €2300-2400...


The solder-stacking shunt mod is safe. Even I could do it. Just make sure you have a good temperature-regulated soldering iron like a TS100 or better, something at least 65W. Then buy some 3M high-temp polyimide Kapton tape and tape completely around all of the shunts (clean with alcohol first). And don't skimp and buy some value Kapton tape; if you can afford a 3080/3090, you can afford 3M polyimide tape. Conductive paint is actually harder to use than solder, because the paint is messy; even when you use Super 33+ tape around the shunts while applying the paint, it's still messy.

If your shunts are 1-watt shunts, they're even easier to mod than 2W shunts (Gigabyte and Nvidia use 2W shunts with depressed edges, which take some extra work; Asus, EVGA, and MSI use 1W shunts). Remember to use rosin flux and to -heat- the conductive edges of the shunts so they get hot enough for the solder to flow onto them and build your 'bridge'. Solder won't flow onto cold surfaces, and you won't burn up the shunts by touching them with the iron (this is the first thing people are afraid of).

The main issue with stacking is making sure the shunts are flat, there are no solder spikes or balls on top of them, and the stacked shunt isn't touching the backplate. If you think it's going to be close, put some Kapton tape above the shunt on the backplate and test for contact (Super 33+ tape isn't good enough; it can tear under pressure from heat and sharp solder daggers).

The only hard soldering method is desoldering the original shunts to fit 3 mOhm replacements. That's hard, mainly because of how the PCB literally absorbs all of the heat from the iron... you need a REALLY good iron, or a hot-air station, or an oven to preheat the board for that type of mod.
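To see why stacking works at all: the card's controller infers current (and thus power) from the voltage drop across a shunt of known resistance, so adding resistance in parallel over the shunt shrinks the drop and makes the controller under-read true power. A rough Python sketch of the arithmetic (the 5 mOhm stock value is an assumption for illustration; actual shunt values vary by board):

```python
def parallel(r1, r2):
    """Effective resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

R_ORIG = 0.005   # assumed 5 mOhm stock shunt
R_STACK = 0.005  # stacking an identical 5 mOhm shunt on top

r_eff = parallel(R_ORIG, R_STACK)   # 2.5 mOhm effective
scale = r_eff / R_ORIG              # controller now reads 50% of true power

# With a 370 W BIOS limit, the card could then actually draw:
true_power_at_limit = 370 / scale
print(f"effective shunt: {r_eff * 1000:.2f} mOhm")
print(f"real draw at the 'limit': {true_power_at_limit:.0f} W")
```

This is also why people pick the stacked value carefully: an equal-value stack halves the reading, which is far more headroom (and risk) than most coolers and connectors are built for.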


----------



## mouacyk

Mine is a Gigabyte, with all the depressed shunts. It's going to take extra solder and some skill to get good contact without introducing resistance.


----------



## erasmo0284

Falkentyne said:


> The solder stacking shunt mod is safe. Even I could do it. Just make sure you have a good temp regulated soldering iron like a TS-100 or better. Something at least 65W. And then buy some 3M High temp Polyimide Kapton tape. Then just tape completely around all of the shunts (clean with alcohol first). And don't skimp out and buy some value kapton tape. If you can afford a 3080/3090 you can afford 3M polyimide tape. It's actually harder to use conductive paint rather than solder because the paint is messy, even when you use Super 33+ tape around the shunts (When applying paint)--it's still messy.
> 
> If your shunts are 1 watt shunts, it's even easier to shunt mod those than 2W shunts (Gigabyte and Nvidia use 2W shunts with depressed edges, those take some extra work to mod. Asus and eVGA and MSI use 1W shunts). Remember to use Rosin Flux and to -heat- the conductive edges of the shunts so that it gets hot enough for the solder to flow onto it to build your 'bridge'. Solder won't flow to cold surfaces and you won't burn up the shunts by having the iron touch it (this is the first thing people are afraid of).
> 
> The main issue with stacking is making sure the shunts are flat, there's no solder spikes or balls on top of the shunts, and the stacked shunt isn't touching the backplate. If you think it's going to be close, put some Kapton tape (Super 33+ tape isn't good enough, it can tear under pressure from heat and sharp solder daggers) above the shunt on the backplate and test for contact.
> 
> The only hard soldering method is desoldering the original shunts to apply 3 mOhm replaced shunts. That's hard. Mainly due to how the PCB literally absorbs all of the heat from the iron...you need a REALLY good iron, or a hot air station or an oven to preheat the board for that type of mod.


What comes next after the shunts are modded?


----------



## Broder

Glottis said:


> Reading 3090 thread they have 390W BIOS for 2x8pin cards so it's obvious our 3080 BIOSes are lower wattage not because of safety, they just want to make 3x8pin cards look better. It's frustrating when I know my TUF cooler could easily handle 400W or even 450W BIOS.


Having a 390W BIOS doesn't mean the card can actually reach a consistent 390W. My Gigabyte 3080 stabilizes between 350-360W (closer to 350W), despite having a 370W BIOS. The exact same thing happens with Asus 375W BIOS cards. And no, it's not thermal throttling (the card is always below 64C, even on hot days). The absolute maximum rating for 2x8pin is 375W (2x150W + 75W PCIe), but in practice, watching GPU-Z, I can see that only the 8-pin connectors get close to their rated 150W; the PCIe slot power is usually below 60W. That's why the card never reaches the rated 370W (and if it does, it's a momentary spike, not a consistent 370W).

Now I know that's old monkey talk, since I owned an R9 295X2, and that card would push 600W off a 2x8-pin setup. But modern Nvidia cards don't work that way; they rigorously respect the maximum power ratings, and this is why we never see the designated power levels reached (unless they sit far below the ratings of the power setup, e.g. a 300W BIOS on a 2x8pin card). I don't know what happens after you shunt mod, but if you're not doing that, ~350W-ish seems to be all you can get from 2x8pin right now.
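The arithmetic behind those numbers, spelled out (the ~60 W slot figure is just what the GPU-Z readings above suggest, not a spec value):

```python
# Rated power budget for a 2x8pin card
PIN8_RATED = 150       # W per 8-pin PCIe connector
PCIE_SLOT_RATED = 75   # W through the motherboard slot

rated_max = 2 * PIN8_RATED + PCIE_SLOT_RATED    # 375 W ceiling

# What the GPU-Z readings suggest in practice (slot held under ~60 W)
observed_slot = 60                               # approximate
observed_max = 2 * PIN8_RATED + observed_slot    # ~360 W in the real world

print(f"rated: {rated_max} W, observed ceiling: ~{observed_max} W")
```

Which lines up with cards on 370-375W BIOSes plateauing around 350-360W.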


----------



## Glottis

Broder said:


> Having a 390W BIOS doesn't mean it can actually reach a consistent 390W. My Gigabyte 3080 stabilizes between 350-360W (closer to 350W), despite it having a 370W BIOS. The exact same thing happens with Asus 375W BIOS cards. And no, it's not thermal throttling (card is always below 64C, even on hot days). The absolute maximum rating for 2x8pin is 375W (150Wx2 + 75W PCIe), but in practice, analyzing GPU-Z, I can see that only the 8-pin connectors get close to their rated 150W, the PCIe power is usually below 60W, that's why the card never reaches the rated 370W (if it does, it is a ridiculous spike, not a consistent 370W).
> 
> Now I know that's old monkey talk, since I was the owner of a R9 295X2, and that card would push 600W off a 2x8-pin grid. But modern Nvidia cards don't work in this fashion, they rigorously respect the maximum power ratings, and this is why we never see the designated power levels reach what they should (unless they are far below the ratings of the power setup, e.g. a 300W BIOS in a 2x8pin card). I don't know what happens after you shunt mod, but if you're not doing that, ~350Wish seems all you can get from 2x8pin right now.


Again, this has nothing to do with safety or rigorously respecting power ratings. The 3090 FE has a 400W BIOS and it's a 2x8pin card. We have nerfed BIOSes on our 2x8pin 3080s for 2 reasons only:
1. to make 3080 3x8pin cards look better and easier to upsell.
2. to make any 3090 look better and easier to upsell.

I also want to add that the fact that my 3080 TUF has a 375W BIOS as advertised by Asus, yet never ever reaches that 375W, is false advertising. I have Afterburner always running and the max spike I've seen was like 365W. It's running at only 340-350W 99.9999% of the time.


----------



## lmfodor

Here are my ReBAR results with an ASUS TUF 3080 OC + 5900X at 1440p with a 170 FPS cap in FH4, everything on ultra/extreme. No additional OC, the standard 1815 MHz. Memory: Trident Z Neo 3800 CL14.

ON: 167 FPS (against the 170 cap)
OFF: 158 FPS

That seems very good to me!


Sent from my iPhone using Tapatalk Pro


----------



## weleh

My Suprim X is constantly at 430W in synthetics.

Pretty sure a higher PL could help at least there; in gaming I haven't seen it pull more than 350W.


----------



## Hirtle

I scored 13 678 in Port Royal


----------



## mouacyk

nam3less said:


> then use that toward a 4090 or 4080ti to put under water


At 1 kilowatt, water will be preferred. Adding to the pain of another unobtainable card:


----------



## edhutner

Loool


----------



## Broder

Glottis said:


> Again, this is has nothing to do with safety or rigorously respecting power ratings. 3090 FE has 400W BIOS and it's a 2x8 pin card. We have nerfed BIOSes on our 2x8pin 3080 for 2 reasons only:
> 1. to make 3080 3x8pin cards look better and easier to upsell.
> 2. to make any 3090 look better and easier to upsell.
> 
> I also want to add that the fact that my 3080 TUF has 375W BIOS as advertised by Asus, yet never ever reaches that 375W is false advertising. I have Afterbuner always running and max spike I've seen was like 365W. It's running at only 340-350W 99.9999% of the time.


If they want to sell more of the more expensive 3x8pin models, they should be doing exactly the opposite: advertising LOWER BIOS power for the 2x8pin models (that's what would convince people to buy the more expensive models), not HIGHER values.

Currently, the only thing I'm worried about are the claims of the new ReBAR BIOSes reducing the PL of the cards. I have seen quite a few people complaining that their overall performance was reduced with the ReBAR BIOS, claiming the power readings in GPU-Z are lower and that the sustained clocks dropped by ~100 MHz.


----------



## Imprezzion

I can test it with my Gigabyte Gaming OC if I can source a compatible ReBAR BIOS. It doesn't have to be a Gigabyte BIOS either; it takes EVGA XC3 and ASUS TUF BIOSes as well with no issues.


----------



## Falkentyne

Broder said:


> If they want to sell more of the more expensive 3x8pin models, they should be doing exactly the opposite, advertised LOWER BIOS power (this is what's going to convince people to buy the more expensive models) and not advertise HIGHER values for the 2x8pin models.
> 
> Currently, the only thing I'm worried about are the claims of the new ReBAR BIOS reducing the PL of the cards. I have seen quite a few people complaining that their overall performance was reduced with the ReBAR BIOS, claiming the power readings in GPU-Z are lower and that the sustained clocks are reduced in +100Mhz.


For the cards that patch the BIOS rather than replacing it, the power limit doesn't change.
The FE cards' patch doesn't even update the BIOS; the BIOS version remains identical to before.


----------



## Broder

Imprezzion said:


> I can test it with my Gigabyte Gaming OC if I can source a compatible ReBAR BIOS. Doesn't have to be Gigabyte BIOS either, it takes EVGA XC3 and ASUS TUF BIOS as well with no issues.


I'm looking to test it myself with the Vision OC, but the official Gigabyte BIOS update is broken for me, so I'm going to have to use NVflash to update my card. I'm waiting for verified ReBAR BIOSes to show up on the database. Because my Asus Z370 board didn't receive a ReBAR update, I'm in no hurry to update my card. On a side note, Asus is the only manufacturer that hasn't updated their Z300-series motherboards. I happen to own two Asus Z370 boards and one Gigabyte Z370 board, so I only have one motherboard at home that is ReBAR capable; the problem is that the Gigabyte mobo is going to be paired with a 3060 Ti. I'll definitely think twice in the future before I buy an Asus board again. If the Gigabyte ReBAR BIOS decreases the PL, I'm going to have to try an Asus or EVGA BIOS, or maybe I'll just stick with the pre-ReBAR BIOS until Asus decides to update their Z370 motherboards (which seems unlikely to happen) or until I actually change my motherboard to one that's ReBAR capable.



Falkentyne said:


> The cards that patch the BIOS rather than updating the BIOS doesn't change the power limit.
> The FE cards patch doesn't even update the BIOS. The BIOS version remains identical to before.


I'm not sure that's how it works. With Gigabyte's update, for example, the BIOS code remains exactly the same after the ReBAR patch, so people have no way of knowing if the BIOS has ReBAR just by checking the BIOS code; they have to check the driver stats to see if ReBAR is enabled in the system. Despite that, the PL was decreased on some cards. And remember, I am talking about the effective PL, not the advertised PL, since many modern 3080 and 3090 cards do NOT reach their advertised PL. For example, both Gigabyte and Asus 3080 cards cap at around 350W, despite their BIOSes being advertised at 370W and 375W. With the ReBAR update, it seems the power cap is lowered even further. I have seen complaints from 3090 owners whose 390W power cap dropped to 350W after the ReBAR update; that's a 40W drop with the new BIOS, a massive decrease. I'm not sure how bad the situation is for the 3080, since I haven't seen anyone report actual values (just people complaining of lower 3DMark scores after the ReBAR update), so I guess I'm going to have to check myself to see if, and how badly, the PL is decreased.


----------



## Falkentyne

Broder said:


> I'm looking to test it myself with the Vision OC, but the official Gigabyte BIOS update is broken for me, so I'm going to have to use NVflash to update my card. I'm waiting for verified ReBAR BIOSes to show up on the database. Because my Asus Z370 board didn't reciever ReBAR update, I'm not in a hurry to update my card. On a side note, Asus is the only manufacturer that hasn't updated their Z300 motherboards, and I happen to own two Asus Z370 boards and one Gigabyte Z370 board, so I only have one motherboard at home that is ReBAR capable, the problem is that my Gigabyte mobo is going to be paired with a 3060 Ti, so I'll definitely think twice in the future before I buy an Asus board again. If Gigabyte ReBAR BIOS decreases the PL, I'm going to have to try Asus or EVGA BIOS, or maybe I'll just stick with the pre-ReBAR BIOS until Asus decides to update their Z370 motherboards (and that seems unlikely to happen) or until I actually change my motherboard to one that's ReBAR capable.
> 
> 
> I'm not sure that's how it works. With Gigabyte's update, for example, the BIOS code remains exactly the same after the ReBAR patch, so people have no way of knowing if the BIOS has ReBAR just by checking the BIOS code, they have to check the driver stats to see if ReBAR is enabled in the system. Despite that, the PL was decreased on some cards. And remember, I am talking about effective PL, not advertised PL. Since many modern 3080 and 3090 cards do NOT reach their advertised PL. An example is that both Gigabyte and Asus 3080 cards will cap at around 350W, despite their BIOS being advertised at 370 and 375W. With the ReBAR update, it seems as if the powercap is even lowered. I have seen complaints of 3090 owners that originally had a 390W power cap drop to 350W after the ReBAR update, that's a 40W drop with the new BIOS, it's a massive decrease. I'm not sure how bad the situation is for the 3080, since I haven't seen anyone report in actual values (just people complaining of lower 3DMark scores after the ReBAR update), so I guess I'm going to have to check it myself to see if, and how badly, the PL is decreased.


You can check by just looking at the GPU-Z BIOS version before and after the ReBAR patch. If it's the exact same BIOS version, then the power limit could not have been changed, since ReBAR itself has no access to power limits. That also means that disabling ReBAR in the BIOS in that case would give you the same PL as before the patch (since, again, the BIOS version did not change).

Manufacturers cannot just make a new BIOS anytime they want. It has to be certified and "signed" by Nvidia directly (aka HULK or Cert 3.0).


----------



## weleh

On the Suprim X, everything looks normal with the REBAR VBIOS.


----------



## edhutner

On the Suprim X, did you update both BIOS positions ("silent" and "gaming"), or only one?


----------



## weleh

Only updated it in the gaming position, but I believe it updates both BIOSes when you do it the MSI Live Update way, since nothing changed.


----------



## edhutner

Is there any official way to roll back if I am not satisfied?
I have made a backup of my current BIOS using GPU-Z; I guess I'll be able to flash it back with nvflash.


----------



## Broder

Falkentyne said:


> You can check by just looking at the GPU-Z Bios version before and after the Rebar patch. If it's the exact same bios version then the power limit could not be changed since rebar itself has no access to power limits. That also means that disabling Rebar in the BIOS in that case would give you the same PL as before patch (since again the bios version did not change).
> 
> Manufacturers can not just make a new BIOS anytime they want. It has to actually be certified and "signed" by Nvidia directly. (Aka HULK or Cert 3.0).


So, here goes:








The BIOS version has not changed. F21 (no ReBAR) and F22 (ReBAR) are both 92.02.26.48.70.

Performance in 3DMark Time Spy went down from 18200 to 17800; these are the words of the card's owner:

"I managed to update VBIOS on 3080 Vision OC from F20 to F21, resize BAR is enabled, and... my Maximum performances went down.

For example - even something like 3dMark TimeSpy with not suppose to be affected in any way by Resize BAR functionality - with the same setup in Afterburner as under previous VBIOS- fall from around 18200 g-score to only 17800... "

3DMark is not ReBAR compatible, so we know ReBAR on or off should have no effect on the 3DMark score. Thus, if his performance went down, this proves that the PL was decreased with the ReBAR BIOS (F22), since nothing else in the system changed. And this is not an isolated complaint; I have seen many others reporting the exact same problem, and from other vendors (such as MSI), so this seems to be a widespread issue for GA102 cards. (GA104 cards don't seem to be affected; I can attest to this because I also have a 3060 Ti that didn't receive a PL downgrade with the ReBAR BIOS.)

I have the exact same card model (3080 Vision OC) and I'm willing to test the ReBAR BIOS myself and verify the actual power levels during benchmarking, but first I need to get my hands on a verified version of the ReBAR BIOS, since Gigabyte's official BIOS updater is broken on my PC, and my 3080 card is too big to fit my other build (which is mini ITX), so I can't just stick my 3080 there to get the update to work in there.


----------



## Falkentyne

Broder said:


> So, here goes:
> View attachment 2485075
> 
> BIOS version has not changed. F21 (no Rebar) and F22 (Rebar) are both 92.02.26.48.70
> 
> Performance in 3DMark Time Spy went down from 18200 from 17800, these are the words from the card's owner:
> 
> "I managed to update VBIOS on 3080 Vision OC from F20 to F21, resize BAR is enabled, and... my Maximum performances went down.
> 
> For example - even something like 3dMark TimeSpy with not suppose to be affected in any way by Resize BAR functionality - with the same setup in Afterburner as under previous VBIOS- fall from around 18200 g-score to only 17800... "
> 
> 3DMark is not ReBAR compatible, so we know ReBAR on or off should have no effect on the 3DMark score. Thus, if his performance went down, this proves that the PL was decreased with the ReBAR BIOS (F22), since nothing else in the system was changed. And this is not an isolated complaint, I have seen many others claiming the exact same problem, and from other vendors (such as MSI), so this seems to be a widespread issue for GA102 cards (GA104 cards don't seem to be affected by this problem, I can attest because I also have a 3060 Ti that didn't receive a PL downgrade with the ReBAR BIOS).
> 
> I have the exact same card model (3080 Vision OC) and I'm willing to test the ReBAR BIOS myself and verify the actual power levels during benchmarking, but first I need to get my hands on a verified version of the ReBAR BIOS, since Gigabyte's official BIOS updater is broken on my PC, and my 3080 card is too big to fit my other build (which is mini ITX), so I can't just stick my 3080 there to get the update to work in there.


He updated the drivers. The drivers caused the Time Spy score to go down. Notice the drivers are not the same? It has nothing to do with BAR.
Time Spy went down; Port Royal went up. I lost 100 points in Time Spy and gained 250 points in Port Royal.


----------



## Broder

Falkentyne said:


> He updated the drivers. The drivers caused Timespy to go down. Notice the drivers are not the same? Has nothing to do with BAR.
> Timespy went down. Port Royal went up. I lost 100 points in Timespy and I gained 250 points in Port Royal.


The screenshot is from a different user; I just grabbed it because it proves that both the F20 and F21 BIOSes use the same identification number. And I never said the performance loss has to do with ReBAR, since 3DMark doesn't even work with ReBAR. It has to do with the fact that AIBs are silently reducing the power limit with this new batch of ReBAR BIOS updates, and that's crippling performance across the board for a multitude of GA102 users.

Plus, the driver is not the issue; on the contrary, the new driver is FASTER in 3DMark (my score went up from 12155 to 12387 in Port Royal, and from 18747 to 18821 in Time Spy, at +175 MHz core, +750 MHz mem and the same fan profiles). So the fact that his performance went DOWN, despite using the new drivers (which are faster than the old ones), shows just how bad the power limit problem is.
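For scale, the relative gains from those quoted scores can be checked in a couple of lines:

```python
def gain_pct(old, new):
    """Percent change from an old benchmark score to a new one."""
    return 100 * (new - old) / old

# Scores quoted above: Port Royal and Time Spy, old driver vs new driver
pr = gain_pct(12155, 12387)   # Port Royal gain, ~1.9%
ts = gain_pct(18747, 18821)   # Time Spy gain, ~0.4%
print(f"Port Royal: +{pr:.1f}%  Time Spy: +{ts:.1f}%")
```

So the new driver's uplift is concentrated in the ray-traced test, while the raster gain sits near the margin of error.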


----------



## Broder

Ok, so I just updated the BIOS on my Gigabyte 3080 Vision OC (thx KHUNGOLF for the tip). My card had the F1 BIOS, so I had to update to the F3 version (no clue why Gigabyte has 5 different BIOS variants for the same card model).

Unlike the F21-to-F22 update, F1-to-F3 DOES change the BIOS code (from 94.02.26.80.3C to 94.02.42.40.33). The BIOS is still rated at 370W.

No power limit loss observed, and performance seems (mostly) unaltered. Testing in 3DMark Port Royal shows the same 350-360W maximum power levels as before, and the Port Royal score is unchanged. Time Spy, on the other hand, seems to have dropped some 100 points. I'm not sure if that's caused by the new BIOS or something else (maybe a system update); I could roll back to the previous BIOS to recheck the Time Spy score, but at this point I don't think I will. I don't want to be flashing my card back and forth, for obvious reasons. Time Spy Extreme also seems mostly unaltered (I did see a 25-point loss, but that's well within the margin of error).

All in all, I'm glad I didn't get the power level drop issue. The original BIOS of my card was one of the most powerful available for 2x8pin models on the market, and I didn't feel like flashing some other vendor's BIOS just to keep my previous level of performance. Turns out I get to keep the 370W performance (which, in reality, is more like 360W) with ReBAR capability (ironically, my motherboard still hasn't received the BIOS update, and at this point it might never get one) without having to steer away from the original BIOS that correctly matches my card.

The BIOS is available in TPU's database (though it's unverified, so use at your own risk) in case someone with a different model/brand of card wants to test it.


----------



## ducegt

Still waiting for Asus Z370 BIOS support for ReBAR (not certain I'll even bother with it), so I tested driver 461.72 vs 465.89 with my daily/stable overclock, and the newer driver netted +195 points in Port Royal, from 12,596 to 12,791.


----------



## Imprezzion

I kinda don't wanna update my motherboard BIOS, as you can't save OC profiles across BIOS versions and it took me weeks to dial in this overclock... I won't be using ReBAR anyway; judging by the initial benchmarks, I don't play anything that benefits from it.


----------



## Falkentyne

Imprezzion said:


> I kinda don't wanna update my motherboard BIOS as you can't save OC profiles across BIOS versions and it took me weeks to daily in this overclock... I won't be using rebar anyways. Don't play anything that benefits from it from initial benchmarks.


It takes less than 5 minutes to re-do your settings.
It takes longer to just write them down on paper (or save them on your phone if it's RAM timings).
It's a lot less trouble than you think.


----------



## ducegt

My motherboard killed an NVMe drive this past year from memory-training reboots, and I can't just enter the same memory timings and presto... I also need some proof the Suprim ReBAR BIOS doesn't limit OC, because my OC provides more gains than ReBAR does. I have a funky Z270 board flashed to Z370, so I'll have to wait for the new BIOS to get modded, or do it myself, which is always a fun rabbit hole to go down. It's not much effort, but if it's going to cause any issues, the benefit just isn't significant.


----------



## weleh

What proof do you need? I have a Suprim X with RBAR VBIOS.


----------



## dhinge

Broder said:


> I'm looking to test it myself with the Vision OC, but the official Gigabyte BIOS update is broken for me, so I'm going to have to use NVflash to update my card. I'm waiting for verified ReBAR BIOSes to show up on the database. Because my Asus Z370 board didn't reciever ReBAR update, I'm not in a hurry to update my card. On a side note, Asus is the only manufacturer that hasn't updated their Z300 motherboards, and I happen to own two Asus Z370 boards and one Gigabyte Z370 board, so I only have one motherboard at home that is ReBAR capable, the problem is that my Gigabyte mobo is going to be paired with a 3060 Ti, so I'll definitely think twice in the future before I buy an Asus board again. If Gigabyte ReBAR BIOS decreases the PL, I'm going to have to try Asus or EVGA BIOS, or maybe I'll just stick with the pre-ReBAR BIOS until Asus decides to update their Z370 motherboards (and that seems unlikely to happen) or until I actually change my motherboard to one that's ReBAR capable.
> 
> 
> I'm not sure that's how it works. With Gigabyte's update, for example, the BIOS code remains exactly the same after the ReBAR patch, so people have no way of knowing if the BIOS has ReBAR just by checking the BIOS code, they have to check the driver stats to see if ReBAR is enabled in the system. Despite that, the PL was decreased on some cards. And remember, I am talking about effective PL, not advertised PL. Since many modern 3080 and 3090 cards do NOT reach their advertised PL. An example is that both Gigabyte and Asus 3080 cards will cap at around 350W, despite their BIOS being advertised at 370 and 375W. With the ReBAR update, it seems as if the powercap is even lowered. I have seen complaints of 3090 owners that originally had a 390W power cap drop to 350W after the ReBAR update, that's a 40W drop with the new BIOS, it's a massive decrease. I'm not sure how bad the situation is for the 3080, since I haven't seen anyone report in actual values (just people complaining of lower 3DMark scores after the ReBAR update), so I guess I'm going to have to check it myself to see if, and how badly, the PL is decreased.





Broder said:


> If they want to sell more of the more expensive 3x8pin models, they should be doing exactly the opposite, advertised LOWER BIOS power (this is what's going to convince people to buy the more expensive models) and not advertise HIGHER values for the 2x8pin models.
> 
> Currently, the only thing I'm worried about are the claims of the new ReBAR BIOS reducing the PL of the cards. I have seen quite a few people complaining that their overall performance was reduced with the ReBAR BIOS, claiming the power readings in GPU-Z are lower and that the sustained clocks are reduced in +100Mhz.


Just flashed the ASUS ROG Strix OC BIOS on my RTX 3080. No issues as such; same power consumption as before, around 390-420W.

But I'm seeing smoother gameplay and almost no stutters in Warzone.


----------



## Clukos

New BIOS on the FE + changed thermal pads to Thermalright Odyssey: I scored 13 334 in Port Royal.

This must be one of the best FE scores in the database.

Edit: In fact, I checked, and in the top 100 GPU scores most are Asus Strix or FTW3 Ultras (both of which have 450W BIOSes, iirc)!


----------



## edhutner

Suprim X user here. I finally took the step of upgrading to the ReBAR BIOS. The MSI Live Update tool actually downloads an exe file, which is self-extracting and contains the ROM file, the nvflash tool, and a batch file that does the actual flash. The BIOS version reported in GPU-Z after the upgrade stays the same, but ReBAR is enabled.
Regarding the power limits: they stay the same. Time Spy and Port Royal scores (old vs new BIOS) are very similar, and so is power usage.


----------



## BluePaint

@Clukos
Well done, that's quite some score with a 2120MHz average, especially with a 5800X. Your VRAM seems to be the best around.
I wonder whether the new driver has some improvements.


----------



## SoldierRBT

The new driver improves my PR score by ~250 points. Seems like the extra score comes from the memory, since it uses more watts (MVDDC power draw). Haven't tested ReBAR enabled vs. disabled with the new driver in PR.

EDIT: Tested again with the same settings. Score improved by 265 points. With the new driver the card hits the PWR limit.
RTX 3080 FE, 0.981V 2085MHz, +1050 mem
Before:

(screenshot)

After:

(screenshot)
I scored 12 840 in Port Royal


Intel Core i9-10900K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10






----------



## Clukos

Yup, the new driver certainly improved RT perf; raster seems the same though.


----------



## dhinge

Broder said:


> So, here goes:
> View attachment 2485075
> 
> BIOS version has not changed. F21 (no Rebar) and F22 (Rebar) are both 92.02.26.48.70
> 
> Performance in 3DMark Time Spy went down from 18200 to 17800. These are the words from the card's owner:
> 
> "I managed to update the VBIOS on my 3080 Vision OC from F20 to F21, Resizable BAR is enabled, and... my maximum performance went down.
> 
> For example - even something like 3DMark Time Spy, which is not supposed to be affected in any way by the Resizable BAR functionality - with the same setup in Afterburner as under the previous VBIOS - fell from around an 18200 graphics score to only 17800..."
> 
> 3DMark is not ReBAR compatible, so we know ReBAR on or off should have no effect on the 3DMark score. Thus, if his performance went down, this indicates that the PL was decreased with the ReBAR BIOS (F22), since nothing else in the system was changed. And this is not an isolated complaint; I have seen many others reporting the exact same problem, including on other vendors' cards (such as MSI), so this seems to be a widespread issue for GA102 cards (GA104 cards don't seem to be affected; I can attest to that because I also have a 3060 Ti that didn't receive a PL downgrade with its ReBAR BIOS).
> 
> I have the exact same card model (3080 Vision OC) and I'm willing to test the ReBAR BIOS myself and verify the actual power levels during benchmarking, but first I need to get my hands on a verified copy of the ReBAR BIOS, since Gigabyte's official BIOS updater is broken on my PC, and my 3080 is too big to fit in my other build (which is mini-ITX), so I can't just stick it in there to get the update to work.


Use NVFlash.
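For reference, the usual NVFlash flow for backing up and cross-flashing looks roughly like this (a sketch from memory; the exact executable name and flags depend on the NVFlash build you download, `target_bios.rom` is a placeholder, and cross-flashing is entirely at your own risk):

```
nvflash64 --save backup.rom     # back up the current BIOS first
nvflash64 --protectoff          # disable the EEPROM write protection
nvflash64 -6 target_bios.rom    # flash, overriding the board/subsystem ID mismatch check
```

Keep the backup somewhere safe; it is the only clean way back to the stock BIOS if the new one misbehaves.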


----------



## Panchovix

Has anyone tried other VBIOSes on the 3080 TUF to raise the power limit without issues? I'm still impressed that even at 0.975V I'm power limited at 375W lol


----------



## Glottis

Panchovix said:


> Has anyone tried other VBIOSes on the 3080 TUF to raise the power limit without issues? I'm still impressed that even at 0.975V I'm power limited at 375W lol


Not worth messing with BIOSes on TUF, unless there is some new development and we get 400W+ BIOS for 2x8pin cards.


----------



## Panchovix

Glottis said:


> Not worth messing with BIOSes on TUF, unless there is some new development and we get 400W+ BIOS for 2x8pin cards.


Oof, thanks, gonna try it in a while then. I just read that someone with a 3090 TUF (which also has 2x8-pin) flashed an Aorus Extreme VBIOS and it worked without much issue; wondering if it will be the same with the 3080.

Thanks!


----------



## mouacyk

Panchovix said:


> Oof, thanks, gonna try after some time then. Just read someone with a 3090TUF (has 2x8 as well) flashed a Aorus Extreme VBIOS and it worked without much issues, wondering if it will be the same with the 3080.
> 
> Thanks!


You can cross-flash any BIOS with a matching power-connector configuration, but be aware of possibly losing display outputs. @Imprezzion suggested, based on his trials, that the XC3 BIOS has the best effective clocks for 2x8-pin cards.


----------



## Panchovix

mouacyk said:


> You can cross-flash any BIOS with pin compatibility, but be aware of possible lost outputs. @Imprezzion suggested that the XC3 BIOS has the best effective clocks for 2-pin cards from his trials.


That one has a lower max TDP, right (366W vs. the TUF's 375W, if I'm not wrong)? That's pretty interesting; better power delivery?


----------



## mouacyk

Panchovix said:


> That one does have lower max TDP right (366W vs 375W of the TUF, if I'm not wrong), so that's pretty interesting, better power delivery?


If I had to guess, I'd say EVGA programmed a tighter V/F curve into that BIOS. If your silicon happens to dance in step with it without crashing, then it's a win. I have yet to try it myself, but anyone who recognizes the meaning of the effective clock tends to know what they're doing with these cards.


----------



## Imprezzion

Both the 366W and 375W BIOSes will limit to 345-350W anyway. That small difference doesn't matter.


----------



## Panchovix

Imprezzion said:


> Both the 366 and 375w BIOS will limit to 345-350w anyway. That small drop doesn't matter.


Oooh that's something I didn't know, thanks for the info.


----------



## marashz

My XC3 for some reason pulls 20W less from 8-pin #1 than from #2... Maybe that's why it never reaches 350W no matter what... Even tried the TUF BIOS; it still sits around 330W. The absolute max I ever saw was a 346W peak...
Oh, my XC3 now sits without a backplate. What are the real pros of using one (less coil whine, better temps, or whatever)?


http://imgur.com/a/uINnH9d

<-- XC3 3080 PCB


----------



## Panchovix

By the way, just out of curiosity: the XOC VBIOS only exists for the RTX 3090, right? (The 1000W one without any limit.)

I know some people flashed that on RTX 3090s with 2x8-pin and could draw near 500W lol

I remember I had a 2070S with 1x8-pin + 1x6-pin and flashed the HOF VBIOS (to draw at most 330W) and it worked without issues; the PCIe cables were definitely hotter though.




marashz said:


> My XC3 for some reason takes 20W less from 8pin #1 than #2... Maybe because of that it never reached 350W no matter what... Even tried TUF bios, still sits around 330W. Max max I ever saw was 346W peak...
> Oh, my XC3 now sits without backplate. What real pros of using one (like less coil whine, better temps of w/e what...)?
> 
> 
> http://imgur.com/a/uINnH9d
> 
> <-- XC3 3080 pcb


On my 3080 TUF I've never seen above 350W, even with the slider maxed.


----------



## Glottis

Panchovix said:


> By the way just by curiosity, the XOC VBIOS only exists for the RTX 3090 right? (The 1000W one without any limit)
> 
> I know some people flashed that on their RTX 3090 with 2x8 pin and it could draw near 500W lol
> 
> I remember I had a 2070S with 1x8 pin + 1x6 pin, and flashed the HoF VBIOS (to draw at most 330W) and it worked without issues, the PCI-E cables were def hotter though.


Are you saying 3090 users successfully flashed 3x8pin BIOS onto 2x8pin cards? If so, why aren't 3080 users doing the same? I don't get it. Either you are wrong, or 3080 users are noobs who missed this obvious thing for 6 months.


----------



## mouacyk

You can flash a 3x8-pin BIOS onto a 2x8-pin 3080. The problem is that the power limit will downscale (2/3 ≈ 66%), resulting in a total power level near 300W, which is even lower than stock.
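The scaling described above is just the BIOS's per-connector budget losing a rail; a quick back-of-the-envelope, assuming the limit is apportioned evenly across the 8-pin inputs (a simplification of the real per-rail budgeting, and the 450W figure is hypothetical):

```python
# Rough model of a 3x8-pin BIOS landing on a 2x8-pin card: the card only
# presents 2 of the 3 connector rails the BIOS budgets for, so the usable
# limit scales by roughly 2/3. For illustration only.

def downscaled_limit_w(bios_limit_w, pins_in_bios, pins_on_card):
    return bios_limit_w * pins_on_card / pins_in_bios

# A hypothetical 450 W 3x8-pin BIOS on a 2x8-pin 3080:
print(downscaled_limit_w(450, 3, 2))  # 300.0, below the 320 W stock limit
```

Which is why the cross-flash usually nets you less power headroom, not more.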


----------



## jura11

Glottis said:


> Are you saying 3090 users successfully flashed 3x8pin BIOS onto 2x8pin cards? If so, why aren't 3080 users doing the same? I don't get it. Either you are wrong, or 3080 users are noobs who missed this obvious thing for 6 months.


Only the KPE XOC 1000W BIOS will work on a 2x8-pin GPU; other 3x8-pin BIOSes like the FTW3, Strix or Suprim won't work as intended on a 2x8-pin GPU.

With the XOC 1000W BIOS you'll be pulling something like 650-670W from a 2x8-pin GPU.

I'm using this BIOS on both of my RTX 3090 GamingPros; they're capped at 65-75% for gaming and at 65% for rendering.

Hope this helps

Thanks, Jura


----------



## mouacyk

jura11 said:


> Only KPE XOC 1000W BIOS will work on 2*8-pin GPU, others 3*8-pin BIOS like FTW3 or Strix or Suprim won't work on 2*8-pin GPU as intended
> 
> With XOC 1000W BIOS you will pulling something like 650-670W from 2*8-pin GPU
> 
> I'm using this BIOS on my both RTX 3090 GamingPro's, just they're capped at 65-75% for gaming and for rendering they are capped at 65%
> 
> Hope this helps
> 
> Thanks, Jura


You're in the 3080 thread. There is no XOC BIOS >500W for 3080, 3xpin or otherwise.


----------



## Panchovix

mouacyk said:


> You can flash 3-pin BIOS onto 2-pin 3080. Problem is power will downscale (2/3 = 66%), resulting in a total power level near 300W, which is even lower than default stock.


Ohh, that explains some tests I did on my 2070S in the past. It seems not all VBIOSes suffer the same way, though: my 2070S had a 260W VBIOS, and flashing, for example, the Strix or Aorus Master BIOS (330W and 326W limits) didn't work (as you explain, I hit the power limit at lower power usage), but flashing the HOF VBIOS (the GALAX RTX 2070 Super one) did let me use all 320W without issues. So I'm wondering whether it's something specific to certain VBIOSes or not.

(There's no 3080 HOF, right? lol)



jura11 said:


> Only KPE XOC 1000W BIOS will work on 2*8-pin GPU, others 3*8-pin BIOS like FTW3 or Strix or Suprim won't work on 2*8-pin GPU as intended
> 
> With XOC 1000W BIOS you will pulling something like 650-670W from 2*8-pin GPU
> 
> I'm using this BIOS on my both RTX 3090 GamingPro's, just they're capped at 65-75% for gaming and for rendering they are capped at 65%
> 
> Hope this helps
> 
> Thanks, Jura


Oh, that explains it then; some 3x8-pin VBIOSes won't work well on 2x8-pin cards, except the XOC one for the 3090. Sad there is no XOC 1000W VBIOS for the 3080.


----------



## jura11

mouacyk said:


> You're in the 3080 thread. There is no XOC BIOS >500W for 3080, 3xpin or otherwise.


Yup, I know, and that's what I said regarding the XOC BIOS on the RTX 3090.

The same applies to using a 3x8-pin BIOS on a 2x8-pin GPU, which you shouldn't do.

Hope this helps

Thanks, Jura


----------



## jura11

Panchovix said:


> Ohh that explains some tests I did on my 2070S in the past, but it seems not all VBIOS suffer the same thing, because my 2070S had a 260W VBIOS, but flashing for example the Strix or the Aorus Master which had 330-326W limit, it didn't work (as you explain, I had power limit with lower power usage) but when flashing the HoF VBIOS (this one GALAX RTX 2070 Super VBIOS) did let me use all the 320W without issues, so wondering if it's something with some VBIOS or not.
> 
> (There's no 3080 HoF right? lol)
> 
> 
> 
> Oh that explains it then, some VBIOS for 3x8 pin won't work well on 2x8pin except the XOC for the 3090, sad there is no XOC 1000W VBIOS for the 3080.


Hopefully EVGA, Galax or Asus will release an RTX 3080 XOC BIOS that works on 2x8-pin GPUs.

Until then, the only option is to shunt mod. I wish I knew how to do that; if I did, I'd already have shunted my RTX 3090s.

Hope this helps 

Thanks, Jura


----------



## gabeomatic

Hey guys,
Managed to snag a 3080 Trinity OC and wonder what the general consensus is for setting this thing up right for high clocks (I'll stay on air). Using the TUF OC BIOS? Undervolting? Might change the pads down the line (I won't be mining).

I did see someone able to pull 400+!? watts with the Strix BIOS, but that was watercooled of course, at 50C; still pretty insane for a Zotac haha. Is it true that the latest Resizable BAR update from Zotac will in turn gimp the power limits of the card again? Looking for the best ~360W setup if possible. What's the workaround? Thanks!

My last card was an EVGA FTW3 3070, but prior to that I had a golden-ish 980 Ti AMP Extreme which ran at 1500/8100 on air (almost reference 1080 performance) and which I still use in a guest PC at 1080p.


----------



## Panchovix

TBH if someone finds any way to use more than 400W with a VBIOS in any 2x8 pin card without shunt modding, please tell me haha


----------



## Falkentyne

Panchovix said:


> TBH if someone finds any way to use more than 400W with a VBIOS in any 2x8 pin card without shunt modding, please tell me haha


Not possible at all.


----------



## fray_bentos

gabeomatic said:


> Hey guys,
> Managed to snag a 3080 trinity OC and wonder what the general consensus to setting this thing up right is for high clocks (I'll stay on air) Using the TUF OC bios? Undervolting? Might change the pads down the line (I won't be mining)
> 
> I did see someone was able to get 400+!? watts under the strix bios but that was watercooled of course at 50C - still pretty insane for a zotac haha. Is it true that the latest update for resizable bar from Zotac will in turn again gimp the power limits of the card? Looking for the best 360ish watt setup id possible What's the workaround? Thanks!
> 
> My last card was an EVGA FTW3 3070 but prior to that I had a golden ish 980Ti Amp Extreme which ran at 1500/8100 on air (almost 1080 reference) which I still use in a guest pc @ 1080p


I've previously overclocked every card I've owned, but this is what happened for me on the 3080. I faffed about with overclocking the memory and gave up, as it only added instability. I faffed with core overclocking and gave up, as it just added lots of extra heat and noise for no tangible benefit in games (yes, benchmark scores went up a bit). I settled on an undervolt capped at 856 mV, set via the V/F curve using the maximum stable frequency at that voltage. Set and forget.
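The reason an undervolt buys so much headroom is that dynamic power scales roughly with V²·f. A first-order estimate using the textbook CMOS approximation; the clock figures are invented for illustration, not measured on GA102, and static leakage is ignored:

```python
# First-order CMOS dynamic power model: P ~ C * V^2 * f.
# Compares a stock-ish operating point against an 856 mV undervolt.
# Voltages/clocks below are illustrative assumptions, not measurements.

def relative_power(v_new, f_new, v_old, f_old):
    """Ratio of dynamic power after/before, ignoring static leakage."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

# Hypothetical: 1.000 V @ 1905 MHz stock vs. 0.856 V @ 1830 MHz undervolted.
ratio = relative_power(0.856, 1830, 1.000, 1905)
print(f"~{ratio:.0%} of original dynamic power")  # ~70%
```

A ~4% clock sacrifice costing ~30% of the dynamic power is why the undervolted card stops slamming its power limit.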


----------



## blurp

fray_bentos said:


> I've previously overclocked all cards that I've owned, but this is what happened for me on 3080. I faffed with trying to overclock memory, gave up as it only added instability. I faffed with clock overclocking and gave up as it just added lots of extra heat and noise for no tangible/noticeable benefit in games (yes benchmark scores went up a bit). I settled on an undervolt capped 856 mV set with V/F using maximum stable frequency at that voltage. Set and forget.


Came to the same conclusion with 1875MHz @ 875mV on my EVGA FTW3 3080. Great performance with fairly low heat and noise.


----------



## Panchovix

fray_bentos said:


> I've previously overclocked all cards that I've owned, but this is what happened for me on 3080. I faffed with trying to overclock memory, gave up as it only added instability. I faffed with clock overclocking and gave up as it just added lots of extra heat and noise for no tangible/noticeable benefit in games (yes benchmark scores went up a bit). I settled on an undervolt capped 856 mV set with V/F using maximum stable frequency at that voltage. Set and forget.


In my case, overclocking the core on my 3080 works, but it depends heavily on the game. On my TUF I can reach 2130MHz without much issue, but depending on the game and load; for example, in RTX games I get power limited as soon as I go above 2040MHz lol. The memory I can overclock +1500MHz without issues and without losing performance; the problem is that without undervolting I get power limited there as well, since overclocked GDDR6X draws a ton more power.

So my main issue atm is being power limited; at least the card undervolts well, so I'm doing that for now lol


----------



## fray_bentos

Panchovix said:


> In my case with my 3080, overclocking the core works but depends heavily on the game, on my TUF I can reach 2130Mhz without much issues but depending of the game and load, for example on RTX games I get powerlimited as soon as I get more than 2040Mhz lol. Memory overclock I can do +1500Mhz without issues and without losing performance, the problem is, that without undervolting, I get powerlimited as well since overclocking GDDR6X uses ton more of power.
> 
> So my main issue atm is being power limited, at least the card undervolts well so I'm doing that for now lol


Yes, I see this too. I set the highest voltage at which I don't see power limiting in 3DMark Port Royal and RTX games. I also have presets in Afterburner with different voltage caps depending on the load. Generally I just leave it on the 856 mV cap, but I also have a low-voltage 775 mV one, which I use when I'm maxing out my monitor's refresh rate and the extra power isn't needed.


----------



## Imprezzion

Panchovix said:


> In my case with my 3080, overclocking the core works but depends heavily on the game, on my TUF I can reach 2130Mhz without much issues but depending of the game and load, for example on RTX games I get powerlimited as soon as I get more than 2040Mhz lol. Memory overclock I can do +1500Mhz without issues and without losing performance, the problem is, that without undervolting, I get powerlimited as well since overclocking GDDR6X uses ton more of power.
> 
> So my main issue atm is being power limited, at least the card undervolts well so I'm doing that for now lol


I can get as high as 2145MHz @ 1.100V stable on my Gigabyte Gaming OC, but obviously it throttles, being a 2x8-pin card. In heavy titles it usually averages about 1990-2010MHz @ 0.956-0.962V. This is with +1400 memory and basically +105 core.


----------



## jim2point0

I won a custom watercooled PC that has an EVGA 3080 FTW3 Ultra in it. Sadly, the GPU is still on its air cooler; only the CPU is water cooled.

The case is mostly open. Here's a picture of me filling it with coolant:

(photo)
So I tested it out last night for the first time. I wanted to see how far I could push the clocks. At max power, this card still reaches 77C (fans at 80%), even with that beefy cooler. Shame the builder couldn't get the GPU on water, but it is what it is.

Here's a Time Spy Extreme run: I scored 9 132 in Time Spy Extreme

Not going to top any charts with that, but it's still faster than my last system.


----------



## Panchovix

Imprezzion said:


> I can get as high as 2145 @ 1.100v on my Gigabyte Gaming OC stable but obviously it throttles being a 2x8pin card. Usually in heavy titles it averages at about 1990-2010 @ 0.956-0.962v. This is with +1400 memory and basically +105 core.


That's pretty nice. In which game/app are you testing that the overclock doesn't throttle? I want to check what my max is, but being 2x8-pin makes it a ton harder.


----------



## Imprezzion

Panchovix said:


> That's pretty nice, in which game/app are you testing the overclock to not throttle? I want to check how much is my max, but being 2x8 pin makes it a ton harder


Battlefield 4 at 1080p with 150% resolution scale, everything maxed, is one of the tests; it stays just under the power limit. Cyberpunk 2077 at 1080p with RT Psycho and DLSS on Performance is as well. The higher you set DLSS (so the lower the internal resolution), the less power it draws. It looks horrible, but it's a great stability test, as it uses full ray tracing and DLSS calculations in complex scenes.


----------



## Panchovix

Imprezzion said:


> Battlefield 4 1080p 150% res scale all max is one of the tests, it stays just under the power limit, and Cyberpunk 2077 1080P RT Psycho with DLSS on Performance is as well. The higher you put the DLSS (so the lower the resolution) the less power draw it has. It looks horrible but is a great stability test as it does use full ray tracing and DLSS calculations and complex scenes.


Many thanks man! Have both of these games, gonna try later.


----------



## Pedros

Anyone having "major" issues with the latest Nvidia drivers and ReBAR active? Since I updated and activated ReBAR I've had the following issues (mostly in Apex, since it's the only game I play daily):

(This happens at default settings, not OC'd.)

- Random crashes to desktop
- Textures get stuck on the screen and follow me while I walk around the map
- Some bad artifact spiking, which happened twice
- Hard crashes, where the screen goes black but some textures stay on the screen in blue and red colors
- Some BSODs where the faulting file starts with nv

I was kind of thinking "damn... my card is broken"... but it seems lots of people are having issues with the drivers.

I did 2 hours of testing using the 3DMark stability tests and everything was fine, both stock and OC'd.


One example of what my screen looked like the other day... perfect visibility

This "artifact" was very dynamic; it looked a bit like the laser effects you see at nightclubs or trance shows. Depending on where I looked it would change form and color, always with this "spiking" animation.

(screenshot)

After a fresh Windows install, DDU and NVCleanstall... I even tried installing the default driver (with the Experience app included), because who knew, maybe there was a catch... No change.

OK, the artifacts don't show up that often, but the crashes are daily. Three days ago I actually got a piece of rock texture stuck on the screen and took it everywhere I went... lol, just weird.


----------



## Jimmy2Shoes

Imprezzion said:


> Battlefield 4 1080p 150% res scale all max is one of the tests, it stays just under the power limit, and Cyberpunk 2077 1080P RT Psycho with DLSS on Performance is as well. The higher you put the DLSS (so the lower the resolution) the less power draw it has. It looks horrible but is a great stability test as it does use full ray tracing and DLSS calculations and complex scenes.


Hey Buddy, 

On the Gaming OC, when you opened it up, was there a connector for the RGB? The reason I ask is I want to turn it off, but RGB Fusion is the worst bit of software I have seen in a long time. I'm running my rig in a home cinema, so I'm trying to turn off all RGB.

Thanks


----------



## Colonel_Klinck

Guys anyone know the thermal pad thickness on memory for the MSI Surpim X?


----------



## weleh

I never understood why the end user has to undervolt a GPU to make it run at decent temperatures and noise levels.

Sold my Suprim X 3080 and bought a Toxic 6900 XT. Never again am I going air-cooled on a GPU. Ever...


----------



## Pedros

weleh said:


> Never understood why the end user has to undervolt a GPU to make it work at decent temperatures and noise levels.
> 
> Sold my Suprim X 3080, bought a Toxic 6900XT. Never again am I going air cooler on a GPU. Ever...


That's what I said when I got a 1080 Ti Seahawk X... and then the pump died... The solution? Build a custom loop where you can control and swap the parts yourself without relying on third parties.


----------



## Hiikeri

Colonel_Klinck said:


> Guys anyone know the thermal pad thickness on memory for the MSI Surpim X?


Suprim X. 

3080: 2mm on the frontside (memory) and 3mm on the backside (between card and backplate).


----------



## Colonel_Klinck

Hiikeri said:


> Suprim X.
> 
> 3080: 2mm frontside (memory) and 3mm to backside (between card <> backplate).


Thanks m8


----------



## Panchovix

Pedros said:


> Spoiler: Anyone having "major" issues with the latest Nvidia drivers and ReBAR active? Since I updated and activated ReBAR I've had the following issues (mostly in Apex, since it's the only game I play daily):
>
> (This happens at default settings, not OC'd.)
>
> - Random crashes to desktop
> - Textures get stuck on the screen and follow me while I walk around the map
> - Some bad artifact spiking, which happened twice
> - Hard crashes, where the screen goes black but some textures stay on the screen in blue and red colors
> - Some BSODs where the faulting file starts with nv
>
> I was kind of thinking "damn... my card is broken"... but it seems lots of people are having issues with the drivers.
>
> I did 2 hours of testing using the 3DMark stability tests and everything was fine, both stock and OC'd.
>
> One example of what my screen looked like the other day... perfect visibility
>
> This "artifact" was very dynamic; it looked a bit like the laser effects you see at nightclubs or trance shows. Depending on where I looked it would change form and color, always with this "spiking" animation.
>
> After a fresh Windows install, DDU and NVCleanstall... even tried installing the default driver (with the Experience app included)... No change.
>
> OK, the artifacts don't show up that often, but the crashes are daily. Three days ago I actually got a piece of rock texture stuck on the screen and took it everywhere I went... lol, just weird


Hasn't happened to me, mate (I have ReBAR enabled as well), but I mostly play Warzone, modded Skyrim and Genshin Impact; maybe it's an issue with Apex.




Jimmy2Shoes said:


> Hey Buddy,
> 
> On the gaming OC when you opened it up was there a connector for the RGB. Reason I ask is I want to turn it off but RBG Fusion is the worst bit of software I have seen in a long time. I'm running my rig in a home cinema so trying to turn off all RGB.
> 
> Thanks


Besides my TUF 3080, I have a Gigabyte Gaming OC Pro 3060 Ti, which does have a connector for the RGB and others for the fans, if that helps (it seems the cooler is the same one, just larger and wider on the 3080).


----------



## Imprezzion

Yeah, it does. I have the RGB logo removed from the shroud right now and have it double-sided-taped to the card lol. All the RGB has its own plug.

Once my EVGA FTW3 Hybrid cooler shows up tomorrow or Thursday, I will slap that on and see if I can keep the RGB logo somehow haha.

I am so curious to see if the FTW3 Hybrid cooler fits properly on this Gaming OC, because if it does it means it will fit any 3080, so people could effectively buy an AIO for their 3080s pretty easily. It ain't cheap though; I spent €169 on it and will probably end up spending at least another €40 or so on VRAM and VRM heatsinks and decent thermal tape... but in the end it will be SO worth it.


----------



## Falkentyne

Pedros said:


> Anyone having "major" issues with the latest Nvidia drivers and ReBar active? Since I updated and activated Rebar I had the following issues ( mostly on Apex since it's the only game I play daily )
> 
> ( This happens on default settings, not OC )
> 
> 
> Random crashes to desktop
> Textures get fixed on the screen and go with me while I walk around the map
> Some bad artifact spiking happened twice
> Hard crash, where the screen goes black and but some textures stay on the screen in blue and red colors.
> Some BSODs where the reason is a file that starts with nv
> 
> I was kind of thinking "damn...my card is broke" ... but it seems lots of people are having issues with the drivers.
> 
> did 2 hours of testing using 3D Mark stability tests and everything was fine, both stock or OC'led.
> 
> 
> One of the examples of what was my screen the other day ... perfect visibility
> 
> This "artifact" was very dynamic, it kind of looked like the laser effects you see at the nightclubs or trance shows ... depending on where I looked it would change form and color, always with this "spiking effect" animation.
> 
> View attachment 2486398
> 
> 
> 
> After a fresh windows install, DDU and NVClean Install ... even tried installing the default ( w/ experience app included ) because who knew there wasn't a catch ... No change.
> 
> Ok, The artifacts don't show that often, but crashes are on a daily basis. 3 days ago I actually got a piece of a rock texture that fixed on the screen and I took it everywhere I was going ... lol, just weird


Have not seen this on a 3090, though I did not compare with ReBAR on vs. off in this game.
Try deleting your shader cache in c:\program data\nv_cache: first turn off the shader cache in the NVCP, then browse to that location and delete all the files in it, then reboot, re-enable the shader cache in the NVCP and let it rebuild the cache.
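The cache-clearing step can be scripted. A minimal sketch; the cache directory path varies between driver versions, so the path is passed in rather than hard-coded, and the example path in the comment is an assumption:

```python
# Delete all files inside a shader-cache directory so the driver
# rebuilds it from scratch on the next boot.
from pathlib import Path

def clear_cache(cache_dir):
    """Remove top-level files in cache_dir; returns how many were deleted."""
    cache_dir = Path(cache_dir)
    removed = 0
    if not cache_dir.is_dir():
        return removed          # nothing to do if the folder is absent
    for entry in cache_dir.iterdir():
        if entry.is_file():
            entry.unlink()      # delete cache files; leave subfolders alone
            removed += 1
    return removed

# Example (hypothetical path; check where your driver actually keeps it):
# clear_cache(r"C:\ProgramData\NVIDIA Corporation\NV_Cache")
```

Run it with the shader cache disabled in the NVCP, as described above, then reboot and re-enable.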


----------



## Pedros

Falkentyne said:


> Have not seen this on a 3090. I did not compare with rebar on or off in this game.
> Try deleting your shader cache in c:\program data\nv_cache , by first turning off shader cache in the NVCP, then browsing to that file location, deleting all the files in it, then reboot, enable shader cache in the NVCP and have it rebuild the cache.


Gonna try that, thank you.

Just tried COD: MW and it was crashing like crazy until I forced it to use DX11; no crashes afterward. Apex, on the other hand, although I've not experienced any weird artifacts lately... now and then I still get a crash... pfff :x


----------



## Panchovix

Imprezzion said:


> Battlefield 4 1080p 150% res scale all max is one of the tests, it stays just under the power limit, and Cyberpunk 2077 1080P RT Psycho with DLSS on Performance is as well. The higher you put the DLSS (so the lower the resolution) the less power draw it has. It looks horrible but is a great stability test as it does use full ray tracing and DLSS calculations and complex scenes.


To add: today I tested BF4 at 150% scale maxed and managed to get 2160-2175MHz at 1.1V when I'm not power limited; at 2190MHz it crashes. So I guess that's pretty decent for a TUF card.

Wish I wasn't power limited man haha


----------



## Imprezzion

Great news! The EVGA FTW3 Hybrid cooler showed up today and it fits, for the most part, on the Gigabyte Gaming OC. The waterblock and the VRAM plate fit without any modifications, and I'm fairly sure they will both fit any 3080 model just fine. The power plug for the pump is quite different, but if you reverse it and cut the locking tab off it fits in the fan power slot, and PWM control and RPM readout work as long as you reverse it in the fan socket; the pinout is the other way around.

All I've got to do now is find a way to cool the VRMs, which will probably end up sacrificing the Gaming OC stock cooler (and its heat pipes, unfortunately) and using both VRM pieces as plain heatsinks without heat pipes; the VRM pads aren't connected to the heat pipes anyway. Or I'll have to glue on some copper Alphacool heatsinks.

Temperature tests and pics coming later. So far the core was idling at 21C.


----------



## Imprezzion

Imprezzion said:


> Great news! The EVGA FTW3 Hybrid cooler showed up today and it fits for the most part on the Gigabyte Gaming OC. The waterblock and the VRAM plate fit without any modifications and I can be pretty sure they will both fit any model 3080 just fine. The power plug for the pump is quite different but if you reverse it and cut the locking tab off it does fit in the fan power slot and it does work with PWM RPM and power as long as you reverse it in the socket for the fans. The pinout is the other way around.
> 
> All I gotta do now is find a way to cool the VRM's which probably ends up in sacrificing the Gaming OC stock cooler (and heat pipes unfortunately) and just using both VRM pieces as normal heatsinks without heat pipes. The VRM pads aren't connected to the heat pipes anyway. Or I have to glue on some copper Alpha cool heatsinks..
> 
> Temperature tests and pics coming later. So far the core was idling at 21c so.


Hopefully this Akasa thermal tape is strong enough not to drop the pretty heavy heatsinks. I ended up harvesting the heatsinks from the EVGA FTW3 Hybrid shroud; they fit with a bit of trimming on a sanding block and an angle grinder. I also drilled mounting holes for my Arctic backplate and trimmed it so that it fits with a few small screws and thumbnuts and has a proper mount now. For fans, it has push-pull Cooler Master MF120 ARGBs on the rad, and the stock EVGA FTW3 ARGB fans are mounted to the bottom of the card for airflow over the VRM / VRAM / pump, all controlled through a SATA-powered fan controller that gets its RPM and PWM signal from the pump-fan wiring so that I don't overload the GPU's fan controller.

Temps: this is with CP2077 1.12, 1080p all maxed, RT Psycho, DLSS Quality, just running around doing some side missions (+105 core, +1400 memory).

(screenshot)

So, all in all, this mod dropped core and hotspot by 10C (even though the hotspot delta is still quite big) and memory by 20C. And best of all, the card went from a wind tunnel to absolute silence; the fans are barely spinning at 1060RPM. They can go as high as 2000RPM if needed.

Pics coming from my phone, takes a while to upload.


----------



## Hiikeri

Panchovix said:


> To add, today tested BF4 at 150% scale maxed, managed to get 2160-2175Mhz at 1.1V if I'm not powerlimited, at 2190Mhz it crashes, so I guess it's pretty decent for a TUF card.


Btw, ASUS BIOSes report a +30-50MHz higher target clock than other manufacturers' BIOSes.

Target? The clocks you see in Afterburner, GPU-Z, or any manufacturer's OC app aren't real MHz; they are only the clocks the card tries to run, and 95% of cards can't actually reach them.

What you, and everyone else, should look at is the effective clock speed, and currently HWiNFO is, I think, the only software that shows us the real clocks the card is running.

About the ASUS "cheat" clocks: my MSI Suprim X original BIOS shows, for example, 2100MHz in Afterburner while my effective clock is 2080MHz.

I used the ASUS Strix OC BIOS for a couple of weeks and I "got" more speed: 2130MHz in Afterburner. But my effective clock was the same as with the MSI BIOS, 2080MHz.

You can see this in all the reviews since last autumn: ASUS cards get +30-50MHz higher clocks in overclocking tests than other brands' cards, yet the FPS can be the same as, for example, MSI or EVGA cards.

It also depends on how you overclock: slider or V/F curve. With V/F curve overclocking you get lower effective clocks than with a basic slider overclock at the same "requested" clock = less real performance.

Example: with my card V/F curve OC'd to 2100MHz, effective clock is ~2050MHz.
But slider OC'd to 2100MHz, effective clock is ~2090MHz.
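The distinction above can be sketched numerically. HWiNFO's effective clock is, roughly, a time average of what the core actually ran over the polling interval, including brief power-limit and idle dips, while Afterburner and GPU-Z show the requested clock. A minimal Python sketch with made-up sample values (the 80/15/5 split is purely illustrative):

```python
# Illustrative sketch (hypothetical numbers): why the "effective" clock
# HWiNFO shows can sit below the clock Afterburner reports.
# Afterburner/GPU-Z show the *requested* boost clock; the effective clock
# is roughly the time-average of what the core actually ran, including
# brief throttle/idle dips within the polling interval.

requested_mhz = 2100  # what the OC slider / V/F curve asks for

# Hypothetical 1 ms samples of the real clock during one polling interval:
# the card holds 2100 most of the time but dips when it hits power limits.
samples_mhz = [2100] * 80 + [2010] * 15 + [1980] * 5

effective_mhz = sum(samples_mhz) / len(samples_mhz)
print(f"requested: {requested_mhz} MHz, effective: {effective_mhz:.0f} MHz")
```

With this split, the requested 2100MHz averages out to roughly 2080MHz effective, the same kind of gap described above.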


----------



## Imprezzion

I noticed the same: on my card the EVGA XC3 BIOS had consistently higher effective clocks than the Gigabyte Gaming OC one. I've now gone back to a Gaming OC BIOS, but a different, newer version (not the ReBAR one), and now it's basically the same. At 1995-2010MHz reported I see about 1960-1980MHz effective while slamming the power limit, as you can see in my HWiNFO64 monitoring above (1979 effective at 1995 reported).

Pics of the mod I did:

Block mounted using original EVGA FTW3 mounting hardware + VRAM plate.









Arctic Accelero IV backplate drilled and modded to mount to the original backplate holes with Arctic Accelero IV mounting screws and thumb nuts.

Stock EVGA radiator fans mounted to the card for VRAM and VRM airflow and card mounted in the case. Yes the 8 pin converter is ugly.









All done, rads in, fans in, card fully working and RGB as well. Even saved the stock Gigabyte RGB logo. Speed control on both the rad and card fans through MSI AB working.


----------



## Panchovix

Hiikeri said:


> Btw, ASUS BIOSes report a +30-50MHz higher target clock than other manufacturers' BIOSes.
> 
> Target? The clocks you see in Afterburner, GPU-Z, or any manufacturer's OC app aren't real MHz; they are only the clocks the card tries to run, and 95% of cards can't actually reach them.
> 
> What you, and everyone else, should look at is the effective clock speed, and currently HWiNFO is, I think, the only software that shows us the real clocks the card is running.
> 
> About the ASUS "cheat" clocks: my MSI Suprim X original BIOS shows, for example, 2100MHz in Afterburner while my effective clock is 2080MHz.
> 
> I used the ASUS Strix OC BIOS for a couple of weeks and I "got" more speed: 2130MHz in Afterburner. But my effective clock was the same as with the MSI BIOS, 2080MHz.
> 
> You can see this in all the reviews since last autumn: ASUS cards get +30-50MHz higher clocks in overclocking tests than other brands' cards, yet the FPS can be the same as, for example, MSI or EVGA cards.
> 
> It also depends on how you overclock: slider or V/F curve. With V/F curve overclocking you get lower effective clocks than with a basic slider overclock at the same "requested" clock = less real performance.
> 
> Example: with my card V/F curve OC'd to 2100MHz, effective clock is ~2050MHz.
> But slider OC'd to 2100MHz, effective clock is ~2090MHz.


Oh I know, the effective clocks were 2150-2160MHz depending on the scene in the game (I actually played a little; BF4's story isn't bad at all, to my surprise lol).

I just set an offset (+255) and then moved the 2175MHz point at 1.093V down to 2160MHz, since I didn't want to go below 1.1V when I wasn't power limited.

For example, on my 3060 Ti (Gigabyte) my max OC was 2145MHz but effective clocks were 2110-2120MHz on average, or something like that.

Here is a pic of HWiNFO64 and BF4 in-game (it says RTX 3060 but it's the 3060 Ti, they have to change the name lol).
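The curve edit described above can be sketched as data. The base V/F points below are hypothetical (every card's curve differs); the procedure is the same: shift every point by the flat offset, then drag the top point down to the clock you actually want at that voltage.

```python
# Sketch of the curve edit described above (hypothetical base curve):
# apply a flat offset to every V/F point, then drag one point down so the
# card never requests more than 2160 MHz at 1.093 V.

base_curve = {0.900: 1740, 1.000: 1905, 1.050: 1980, 1.093: 1920, 1.100: 1935}  # V -> MHz

offset = 255
curve = {v: mhz + offset for v, mhz in base_curve.items()}  # flat +255 offset

# Pull the 1.093 V point down from its offset value (2175) to 2160 MHz
curve[1.093] = 2160

for v, mhz in sorted(curve.items()):
    print(f"{v:.3f} V -> {mhz} MHz")
```

In Afterburner the points above a dragged one get flattened too; this sketch only shows the two-step idea, not the full clamping behaviour.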


----------



## BluePaint

Does somebody have a Resizable BAR Suprim X or Strix OC BIOS to cross-flash onto a Trio?
The TechPowerUp database only seems to have the ReBAR version for the FE.


----------



## Hiikeri

Search it from TPU unofficial bios database.


----------



## edhutner

@BluePaint here is mine, uploaded via GPU-Z








MSI RTX 3080 VBIOS (10 GB GDDR6X, 1440 MHz GPU, 1188 MHz Memory)
www.techpowerup.com





What's strange is that when I download my file from TechPowerUp it is binary-different from the one I saved to my PC with GPU-Z.
So below I'm putting additional downloads:
Saved with GPU-Z: 391.5 KB file on MEGA
Update file from MSI Live Update: 391.4 KB file on MEGA
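A quick way to confirm two ROM dumps really differ is to compare sizes and hashes. A minimal Python sketch; the byte strings below are stand-ins for real files, which you would read with `Path("something.rom").read_bytes()`:

```python
# Quick check for whether two vBIOS dumps are byte-identical: compare sizes
# and SHA-256 hashes. The "dumps" here are stand-in byte strings.
import hashlib

def rom_digest(data: bytes):
    """Return (size_in_bytes, sha256_hex) for a ROM image."""
    return len(data), hashlib.sha256(data).hexdigest()

gpuz_dump = b"\x55\xaa" + b"\x00" * 100   # stand-in for the GPU-Z save
vendor_rom = b"\x55\xaa" + b"\x00" * 99   # stand-in for the MSI update file

for name, data in [("gpuz_save.rom", gpuz_dump), ("msi_update.rom", vendor_rom)]:
    size, digest = rom_digest(data)
    print(f"{name}: {size} bytes, sha256={digest[:16]}...")

print("identical:", rom_digest(gpuz_dump) == rom_digest(vendor_rom))
```

A small size difference alone doesn't tell you which settings differ; it can just be padding or metadata around an otherwise identical BIOS image.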


----------



## BluePaint

Hiikeri said:


> Search it from TPU unofficial bios database.


Ah, yes forgot about unverified DB, thanks!

@edhutner
Thanks, worked!


----------



## EarlZ

Hiikeri said:


> It also depends how to overclocking , slider or V/F curve. V/F curve overclocking you get less effective clocks than basic slider overclocking /w same "request" clocks = less true power.
> 
> Example, my card V/F curve OCd 2100Mhz, Effective clocks ~2050Mhz.
> But on slider OCd 2100Mhz, Effective clocks ~2090Mhz.


I think MSI did great with the Suprim X BIOS and how it handles overclocking with undervolting.

I get the same effective clocks regardless of my overclocking method: offset slider, 1-point V/F curve, 8-10 point V/F curve with 15MHz increments, or even the recommended 3 voltage points for 1 frequency. All of them result in the same effective clock for me, which is probably what made it hard to understand which method is better, since every method gives me the same effective clock, around 10-20MHz below my target clock.


----------



## ducegt

Who cares about a gap in the effective clocks? It's performance that matters (unless you want to balance energy use, I guess). For my card, a V/F curve gives more performance than offsetting the entire curve.


----------



## EarlZ

ducegt said:


> Who cares about a gap in the effective clocks? It's performance that matters (unless you want to balance energy use, I guess). For my card, a V/F curve gives more performance than offsetting the entire curve.



Because that is where the performance comes from, a core clock of 2100 vs 2000 if both have the same 1980Mhz effective will perform at the same speed.


----------



## ducegt

EarlZ said:


> Because that is where the performance comes from, a core clock of 2100 vs 2000 if both have the same 1980Mhz effective will perform at the same speed.


While we're just making up numbers, how the cards boost, and theorizing... let's say OC A has a 120MHz gap and OC B has only a 20MHz gap like in your example. If OC A can be pushed to 2120 (2000 effective) and OC B is maxed at 2000 (1980 effective), then it's easy to see which one is faster. All I'm saying is the gap arguably has nothing to do with "how people should be overclocking." Use a benchmark to determine which method nets the best performance, because you can't use a single clock speed at a specific point in time to tell what's faster in dynamic situations.

To spell it out, I have my card set to max at 2175 @ 1.1V with a custom V/F curve because 1) it's stable and I push as much as I can, 2) it actually does boost that high momentarily in benches and games, and 3) there are even some games where it'll hold ~2160, like Half-Life: Alyx. However, I can't offset the whole curve to reach 2175 @ 1.1V because it's unstable, and I couldn't care less that I could run the whole curve reduced to 2100 so the effective gap would be smaller, because that nets less performance. So basically, whatever nets higher effective clocks is indeed important, but the gap is not.
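The argument reduces to a simple rule: rank OC methods by effective clock (or better, a benchmark score), not by the requested-vs-effective gap. A toy comparison using the hypothetical numbers from the post:

```python
# Toy comparison of the two OCs discussed above: performance tracks the
# *effective* clock, so the size of the requested-vs-effective gap is
# irrelevant on its own. Numbers are the hypothetical ones from the post.

ocs = {
    "OC A (big gap)":   {"requested": 2120, "effective": 2000},
    "OC B (small gap)": {"requested": 2000, "effective": 1980},
}

for name, oc in ocs.items():
    gap = oc["requested"] - oc["effective"]
    print(f"{name}: gap {gap} MHz, effective {oc['effective']} MHz")

# Rank by effective clock, ignoring the gap entirely
best = max(ocs, key=lambda n: ocs[n]["effective"])
print("faster despite the larger gap:", best)
```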


----------



## EarlZ

ducegt said:


> So basically, whatever nets higher effective clocks is indeed important, but the gap is not.


That is exactly my point: the effective clock drives the performance, not the gap, hence the example of a 2100MHz core vs a 2000MHz core when both are at 1980MHz effective. You just wanted a more complex explanation, but you do you.


----------



## marashz

When my XC3 was on the stock air cooler (as we all know, it's total garbage), I got better scores in 3DMark benchmarks with a UV and a flat line. Now I've tried an undervolted OC with a custom curve, and also just a pure +155 on core. I haven't done many custom-curve benchmarks, but atm pure +155 gives the best results, then flat UV, then custom curve.

But the card is so power limited... In PoE with max settings but without Global Illumination, at 5120x1440 I get 120fps and the core clock sits in the 2040-2085MHz range, but when I enable Global Illumination it's instantly 60fps and the core clock drops below 1900MHz... So it feels like the 0.1% and 1% lows in games will be the same without OC, or with whatever OC I make, unless I shunt mod the card. Avg and max are a different story, but I don't care whether I have 140fps or 160fps avg :/


----------



## EarlZ

marashz said:


> When my XC3 was on the stock air cooler (as we all know, it's total garbage), I got better scores in 3DMark benchmarks with a UV and a flat line. Now I've tried an undervolted OC with a custom curve, and also just a pure +155 on core. I haven't done many custom-curve benchmarks, but atm pure +155 gives the best results, then flat UV, then custom curve.
> 
> But the card is so power limited... In PoE with max settings but without Global Illumination, at 5120x1440 I get 120fps and the core clock sits in the 2040-2085MHz range, but when I enable Global Illumination it's instantly 60fps and the core clock drops below 1900MHz... So it feels like the 0.1% and 1% lows in games will be the same without OC, or with whatever OC I make, unless I shunt mod the card. Avg and max are a different story, but I don't care whether I have 140fps or 160fps avg :/


What a coincidence. A few hours ago I was testing how PoE performs on a 3090, as a friend wanted to build a new rig that gives him the best performance in PoE without overspending on the CPU. Global chat was about someone saying they were getting 240fps at 5120x1440.

When I maxed out everything, I was shocked to see Global Illumination take up about 50% of the GPU's power.


----------



## marashz

EarlZ said:


> What a coincidence. A few hours ago I was testing how PoE performs on a 3090, as a friend wanted to build a new rig that gives him the best performance in PoE without overspending on the CPU. Global chat was about someone saying they were getting 240fps at 5120x1440.
> 
> When I maxed out everything, I was shocked to see Global Illumination take up about 50% of the GPU's power.


I'm super new to PoE and have spent like 1h in game, but from my experience, lowering Shadow + GI Quality to High is enough to get 100fps+. I won't turn off GI, as it changes the lighting so much and looks really cool.

GI on + Quality Ultra - 60fps, 317W and 1835MHz effective clock
GI on + Quality High - 110fps, 317W and 1930MHz effective clock
GI off + Quality High - 165fps, 317W and 2010MHz effective clock

My ****ty XC3 just doesn't want to go over 320W


----------



## Imprezzion

Flash the Gigabyte Gaming OC BIOS on it. It might behave better. The XC3 BIOS worked fine on my Gaming OC, so the other way around should also work fine, I guess?


----------



## Panchovix

There's no 3080 with more than a 375W max TDP on 2x8-pin, right? Unlike the 3090, where there are some at 400W with 2x8-pin.
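That 375W ceiling follows from the PCIe power-delivery limits: up to 75W from the slot plus up to 150W per 8-pin connector, so a 400W BIOS on a 2x8-pin board is drawing out of spec. A one-liner to make the arithmetic explicit:

```python
# Where the 375 W ceiling for a 2x8-pin card comes from (PCIe spec limits):
# the slot supplies up to 75 W and each 8-pin PCIe connector up to 150 W.
SLOT_W = 75
EIGHT_PIN_W = 150

def spec_power_budget(n_eight_pin: int) -> int:
    """Maximum in-spec board power for a card with n 8-pin connectors."""
    return SLOT_W + n_eight_pin * EIGHT_PIN_W

print("2x8-pin:", spec_power_budget(2), "W")
print("3x8-pin:", spec_power_budget(3), "W")
```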


----------



## marashz

Imprezzion said:


> Flash the Gigabyte Gaming OC BIOS on it. It might behave better. The XC3 BIOS worked fine on my Gaming OC, so the other way around should also work fine, I guess?


I have tried the Asus TUF BIOS; same 320W with the 110% power limit. But I might try the Gigabyte one some day.


----------



## Panchovix

marashz said:


> I have tried the Asus TUF BIOS; same 320W with the 110% power limit. But I might try the Gigabyte one some day.


If the same issue happens there as well, I guess the only solution we have is shunt modding, sadly.
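For context on why shunt modding works: the card estimates current from the voltage drop across a tiny shunt resistor, so stacking a second resistor on top halves the effective resistance and the card under-reads current (and thus power) by half. The 5 mOhm value below is an assumption for illustration, and this mod defeats the card's power protection, so it is at-your-own-risk territory.

```python
# Why a shunt mod raises the usable power limit: the card computes current
# as I = V_sense / R_shunt. Stacking a second resistor in parallel halves
# R_shunt, so the controller reads half the real current/power.
# 5 milliohm is assumed here purely for illustration.

R_STOCK = 0.005                               # ohms, assumed stock shunt
R_STACKED = 1 / (1 / R_STOCK + 1 / R_STOCK)   # two shunts in parallel

real_current = 30.0                  # amps actually flowing (example)
v_sense = real_current * R_STACKED   # voltage the monitor actually sees
reported_current = v_sense / R_STOCK # controller still assumes R_STOCK

print(f"real: {real_current:.1f} A, reported: {reported_current:.1f} A")
```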


----------



## rjrusek

Water block options for EVGA GeForce RTX 3080 FTW3 ULTRA GAMING?

I am starting my research on water block options for the EVGA RTX 3080 FTW3 ULTRA. Can the people that have gone that route for that card chime in and give their opinion?

Thank you in advance,
RJR


----------



## marashz

rjrusek said:


> Water block options for EVGA GeForce RTX 3080 FTW3 ULTRA GAMING?
> 
> I am starting my research on water block options for the EVGA RTX 3080 FTW3 ULTRA. Can the people that have gone that route for that card chime in and give their opinion?
> 
> Thank you in advance,
> RJR


I'd go with Bykski, but just because of the price: €100-120 from AliExpress with a backplate, temps maybe 1-2C higher than other blocks.
Other options: €220 for EK (€170-175 waterblock + €43-50 for the backplate), or Optimus at $380 (idk if they sell in Europe).
=== EDIT
Or wait for the Heatkiller V waterblock; the price I guess will be about €220 with a backplate, but I have no idea when they're going to release the FTW3 block.


----------



## Imprezzion

Did a remount of the block on mine (EVGA Hybrid FTW3 block on a Gigabyte Gaming OC) because I saw core temps at 54-55C and hotspot way off at 70-72C, indicating a pretty poor mount overall. I was correct: there was a bit too little paste to fully cover the die. I put a slightly thicker line on it and increased the mounting pressure slightly by tightening the spring screws a bit more than before, and now it's 50-51C core with hotspot sitting around 63-64C. Still a pretty big difference, but it also did this on the stock cooler, so probably normal I guess? This is at +105, slamming the power limit at 345-350W the whole time, and with not very high fan speed on the radiator fans (~1150RPM). I'm happy with the Hybrid mod even though it was like €170 shipped. Still cheaper than full cover, and with the card being as power limited as it is, it won't ever see more load / temperature except for ambient changes. VRAM is at around 72-74C at +1400 as well. Great temps there.


----------



## rjrusek

marashz said:


> I'd go with Bykski, but just because of the price: €100-120 from AliExpress with a backplate, temps maybe 1-2C higher than other blocks.
> Other options: €220 for EK (€170-175 waterblock + €43-50 for the backplate), or Optimus at $380 (idk if they sell in Europe).
> === EDIT
> Or wait for the Heatkiller V waterblock; the price I guess will be about €220 with a backplate, but I have no idea when they're going to release the FTW3 block.


I've heard that there have been many issues with FTW3 cards dying after installing an EK water block?

Has anyone here experienced their FTW3 dying after water block installation?


----------



## bmgjet

EK's block quality is the lowest of all the blocks this gen. I had EK blocks on my 980 Ti, 1080 Ti and 2080 Ti, which were all good quality, so I thought the 3xxx series ones would be good as well, but I was mistaken and will never buy EK again.

Their XC3 block would have killed my card if I hadn't noticed that the backplate was touching components and would have shorted out the card if I powered it up. A few people on the EVGA forums didn't notice and ended up with dead cards. That's not the only problem though:

The VRM contact patches don't line up or make contact with the VRM ICs.
The VRAM contact patches are misaligned by half the VRAM width.
The die contact patch is massively convex and only made contact with 2/3 of the die.

I contacted their support about it, and the response was that I could send it back for a partial refund, but I had to use the shipping company of their choice or they wouldn't accept it back.
Shipping worked out to more than the cost of the block.
So basically:
Paid $280 for the block and backplate, $40 for shipping from them to me, $90 in tax.
Then, to return it, DHL (which they required me to use) wanted $399 for shipping, since they don't exist over here, so it had to go through a courier forwarder to a DHL office overseas.
Then they would have given me a $120 refund.

I ended up making the block work. But having paid that much, you shouldn't have to do machining work yourself:
CNC the VRAM and VRM contact patches off, then make up copper shims to fit in the correct places. Lap the die contact patch flat and modify the standoff height to make up the difference.
Double up nylon washers on the backplate to space it away from the shorting components, then use 3mm thermal pads.

It took a bit of fluffing around to get a good mount though. The first mount had me at 58C during gaming. Took a few thou off the standoffs and dropped it to 50C. Took a few more thou off and it's at 45C.
I could probably go a little further, but I got over it, so it's just been running like this for the last few months.

I haven't seen their FTW3 block, but looking at the EVGA forums it isn't any better.


----------



## Imprezzion

How are the hotspot and VRAM temps after all that modding, hehe? Are they any good now?

If I ever put a full cover block on my card, which I probably won't since it's a 2x8-pin and I want to get a 3x8-pin card first, I would go for a Bykski block.


----------



## bmgjet

Hotspot within 8C.
VRAM stays the same temp as the core. I should have mentioned this is on a 3090 as well.
I see in your sig that you have a Civic; if you're looking for some tuning software for it, my program is free, you'd just need to OBD1 swap it.








Honda Tuning Suite | OBD1 and CANBUS Tuning: a free tuning solution for Honda ECUs, with all the features you need to tune OBD0, OBD1 and CANBUS platforms, including 8th gens.
hondatuningsuite.com


----------



## arrow0309

Hiikeri said:


> Search it from TPU unofficial bios database.


I'm also interested, found this one only:









MSI RTX 3080 VBIOS (10 GB GDDR6X, 1440 MHz GPU, 1188 MHz Memory)
www.techpowerup.com





Unverified, should I give it a try?
Never flashed an unverified one


----------



## Imprezzion

I'm running an unverified but seemingly original Gigabyte ReBAR BIOS on my Gigabyte Gaming OC, runs fine and ReBAR works great.


----------



## arrow0309

Imprezzion said:


> I'm running an unverified but seemingly original Gigabyte ReBAR BIOS on my Gigabyte Gaming OC, runs fine and ReBAR works great.


Thanks, already flashed it (the Suprim X); gonna try this ReBAR right away with ACV


----------



## arrow0309

Got a 3 fps boost lol


----------



## gl0ckc0ma

How do I sign up to this club?

I got an Asus ROG Strix 3080 last December. It's been a great GPU so far. I have a stable overclock of 2105MHz; any higher and it crashes.


----------



## Imprezzion

Is it normal for GPU hotspot temps to be as much as 15C higher than the actual core temps? I kinda question my mount a bit seeing that much deviation.. I use an EVGA Hybrid FTW3 waterblock on my card (not an EVGA card) and the core sits around 50-51C with hotspot around 62-66C at +105 core (around 2055-2070MHz).


----------



## marashz

Imprezzion said:


> Is it normal for GPU hotspot temps to be as much as 15C higher than the actual core temps? I kinda question my mount a bit seeing that much deviation.. I use an EVGA Hybrid FTW3 waterblock on my card (not an EVGA card) and the core sits around 50-51C with hotspot around 62-66C at +105 core (around 2055-2070MHz).


Igor'sLAB says it's between 12 and 20 degrees.


----------



## mouacyk

Imprezzion said:


> Is it normal for GPU hotspot temps to be as much as 15C higher than the actual core temps? I kinda question my mount a bit seeing that much deviation.. I use an EVGA Hybrid FTW3 waterblock on my card (not an EVGA card) and the core sits around 50-51C with hotspot around 62-66C at +105 core (around 2055-2070MHz).


I see around an 11C delta on my Bykski full waterblock: 44C GPU temp and 55C hotspot.


----------



## Imprezzion

So, basically, my temps are perfectly reasonable lol. After an evening of gaming at +105 core / +1250 VRAM on Black Ops Cold War (RT/DXR enabled), Cyberpunk (RT Psycho), Division 2 and maybe some Battlefield V (DXR Ultra), the max temps I saw in HWiNFO64 were 54.1C core, 68.9C hotspot, 76C VRAM. That seems in line enough to me not to warrant another full remount. Also, of course the AIO isn't going to get as good a temperature as a full cover block would, especially with my really bad low-pressure fans (selected only on looks), but the card can't put out any more heat anyway because it's just riding the 345-350W limit the whole time.


----------



## Falkentyne

Imprezzion said:


> Is it normal for GPU hotspot temps to be as much as 15C higher than the actual core temps? I kinda question my mount a bit seeing that much deviation.. I use an EVGA Hybrid FTW3 waterblock on my card (not an EVGA card) and the core sits around 50-51C with hotspot around 62-66C at +105 core (around 2055-2070MHz).


I get about a 10-11.7C delta on my 3090 FE after my last repaste and switching from the Odyssey 1.5mm pads to the softer Gelid Extreme 1.5mm pads, stock cooler.

(Exactly 10C in Fortnite with uncapped FPS @ 530W, DX12 + no RT (tested in the main menu at 403 FPS); Heaven (550W) is anywhere between 10.2-10.9C; Overwatch @ 4K @ capped 400W TDP, 10.7-11.7C.)

I think 15C is a bit on the high side, especially with water at such a low temp. It may be low pressure on the core (too-thick or not squishy enough thermal pads?).


----------



## RobertoSampaio

Is anyone having driver crashes with 466.27?


----------



## Imprezzion

Falkentyne said:


> I get about a 10-11.7C delta on my 3090 FE after my last repaste and switching from the Odyssey 1.5mm pads to the softer Gelid Extreme 1.5mm pads, stock cooler.
> 
> (Exactly 10C in Fortnite with uncapped FPS @ 530W, DX12 + no RT (tested in the main menu at 403 FPS); Heaven (550W) is anywhere between 10.2-10.9C; Overwatch @ 4K @ capped 400W TDP, 10.7-11.7C.)
> 
> I think 15C is a bit on the high side, especially with water at such a low temp. It may be low pressure on the core (too-thick or not squishy enough thermal pads?).


Might be. It's the stock pads from the EVGA VRAM plate. The plate does contact the block, so it could be preventing the block from mounting with proper pressure, I guess.


----------



## TedBrosby

Hey everyone. 
I'm in a bit of a deliberation period. 
I got a hold of a Barrow CPU Pump Waterblock combo for AM4 processors and I want to throw a custom loop into my CoolerMaster NR200 with 2x 240mm thin radiators. Unfortunately, my 3080 (Gigabyte Gaming OC) is one of those "not as popular" versions that only has Barrow/Bykski waterblocks. Alphacool recently released a block for it, so now there are three choices, but my main issue is: 
The Gigabyte has this strange mini connector to 8 pin. They include an adapter and most of the previous water-blocks either repurpose the included back-plate so you can use the same connector mounted, or they have their own way of mounting the connector. I'm just not a fan of it, aesthetically. 
Also, the truth is, I just really prefer the Founder's Edition or Strix or even EVGA because those versions have access to blocks like the Corsair GPU block or EK types. 

But given how difficult it is to get a hold of a 3080 to begin with, do I just bite the bullet and be happy with what I have and get the Barrow block for it? (Not a big fan of the Bykski block)


----------



## marashz

TedBrosby said:


> Hey everyone.
> I'm in a bit of a deliberation period.
> I got a hold of a Barrow CPU Pump Waterblock combo for AM4 processors and I want to throw a custom loop into my CoolerMaster NR200 with 2x 240mm thin radiators. Unfortunately, my 3080 (Gigabyte Gaming OC) is one of those "not as popular" versions that only has Barrow/Bykski waterblocks. Alphacool recently released a block for it, so now there are three choices, but my main issue is:
> The Gigabyte has this strange mini connector to 8 pin. They include an adapter and most of the previous water-blocks either repurpose the included back-plate so you can use the same connector mounted, or they have their own way of mounting the connector. I'm just not a fan of it, aesthetically.
> Also, the truth is, I just really prefer the Founder's Edition or Strix or even EVGA because those versions have access to blocks like the Corsair GPU block or EK types.
> 
> But given how difficult it is to get a hold of a 3080 to begin with, do I just bite the bullet and be happy with what I have and get the Barrow block for it? (Not a big fan of the Bykski block)


Ok, so.., I have an EVGA XC3, which is total garbage. And who knows, maybe my card is defective, because max power consumption is 320W, and 8-pin #1 is always about 30W lower than 8-pin #2. The stock air cooler was garbage too: loud, not very effective. Memory OC was low too (+800 mem max on air). I decided to queue for EVGA Step-Up (to a 3080 FTW3). After a few months I bought a waterblock. Spent €220 on it, even though I will have to sell it after the Step-Up and lose like €100. But who knows when that will be.

I'd say if you don't want to sit at an alert bot 24/7 for a week or two, just keep your Gigabyte.

Or there can be another option: the 3080 Ti will be released soon. You can try your luck selling your card and getting a second-hand RTX 3080 (but I doubt it will be easy; I guess most will keep their original 3080 for mining, or will sell high...).


----------



## edhutner

Is anybody getting what are probably "false" temperature limit flags?
Sometimes they show only in Afterburner, sometimes in HWiNFO, sometimes in both.
However, every time I check, the max temperatures are not high enough to expect a temperature limit.


----------



## Falkentyne

edhutner said:


> Is anybody getting what are probably "false" temperature limit flags?
> Sometimes they show only in Afterburner, sometimes in HWiNFO, sometimes in both.
> However, every time I check, the max temperatures are not high enough to expect a temperature limit.
> View attachment 2489414


Known bug that started happening ever since the first 457.xx drivers dropped.
456.98 (hotfix) was the last driver without this problem. It seems to be triggered randomly by a change in power states at low load, or during interruptions at high load.


----------



## edhutner

Thanks Falkentyne.
I was worried that I had a hardware issue or a bad OC.


----------



## Jack Von

Hello, I'm trying to build a new PC with an RTX 3080 Founders Edition, and I saw that some PSUs (power supply units) can shut off when used with the RTX 3080.

Please write the exact model of PSU you are using with your RTX 3080 without any problems/issues. It will help me a lot. Thanks to all who decide to answer.


----------



## EarlZ

Imprezzion said:


> Is it normal for GPU hotspot temps to be as much as 15C higher than the actual core temps? I kinda question my mount a bit seeing that much deviation.. I use an EVGA Hybrid FTW3 waterblock on my card (not an EVGA card) and the core sits around 50-51C with hotspot around 62-66C at +105 core (around 2055-2070MHz).


Based on the 3090 thread it's normal to see between 8-11C; 15C seems a little too high. Stock paste?


----------



## marashz

First I bought the Heatkiller V without a backplate, because the website said it's compatible with the original backplate. Anyway, the 3080 XC3 has nothing to cool on the back of its PCB. Later, after I received the water block, I realised the original backplate is NOT compatible. A few months later I decided to order the XC3 backplate.

Idk if it's just my card, but the Heatkiller V XC3 comes with 0.5mm pads for the memory chips, and with those I had 100C memory just 10s after starting mining; GPU chip max was like 45C, hotspot +11-14C, can't remember. All temps were max values measured by GPU-Z.
Now with the backplate I tried the same 0.5mm pads, still very high memory temps. Now I have 0.5mm on the top/bottom chips and 1mm on the sides. A bit higher chip temp (max I saw was 47C), hotspot 60C, memory up to 58C while gaming. Memory up to 68C while mining.

All this with the pump at 2150rpm. If I max out the pump, chip temp goes to 42C under max load; memory changes very little.

TL;DR: The waterblock manual says to use 0.5mm pads on the memory chips. I used 1mm on some and 0.5mm on others, gained a few C on the chip, but memory is over 30C cooler.


----------



## archaon89

Hi guys, I'm new to the forum.
I have a ZOTAC RTX 3080 Trinity OC and I would like to replace its BIOS to increase the power limit; right now it is around 330W (105%).
Which BIOS do you recommend for my card?
Thank you


----------



## davidm71

Hi guys,

I'm getting high VRAM temps, reaching 102C according to GPU-Z, while placing the GPU under 100% load. This kind of worries me. The GPU model is an MSI 3080 Ventus OC. What temps are you guys getting, and is this normal?

Thanks

After some research, it seems this is normal for some cards. I also read that EVGA FTW3 3080 cards get the lowest temps, in the high 80s, vs 100+ for me with this MSI.


----------



## Glottis

davidm71 said:


> Hi guys,
> 
> I'm getting high VRAM temps, reaching 102C according to GPU-Z, while placing the GPU under 100% load. This kind of worries me. The GPU model is an MSI 3080 Ventus OC. What temps are you guys getting, and is this normal?
> 
> Thanks
> 
> After some research, it seems this is normal for some cards. I also read that EVGA FTW3 3080 cards get the lowest temps, in the high 80s, vs 100+ for me with this MSI.


Memory temp goes up to 88C on my TUF OC.


----------



## Imprezzion

I am so massively frustrated right now. Like, what do I have to do to get proper temps on a 3080...

I run a full EVGA FTW3 Hybrid block on mine, and while the VRAM temperatures are great (72-74C the whole time), the core is still terrible.. after 3 remounts it hasn't changed 1 degree.. I even put over €100 worth of high-performance static-pressure fans (Alpenföhn Wing Boost 3, 2200RPM) on the rad in push-pull, and (at 1350RPM) the core sits at 55-58C with hotspot at 70-72C. First, I'm still quite shocked at how far the hotspot is off the edge/core temp, and it still seems super high for a proper 240 AIO designed specifically for a 3080..

Do I really have to remount again with different paste or whatever? Every time I pull it apart the spread is perfect, and it's not like I'm using a bad paste; Prolimatech PK-3 is a top-10 paste and I've used it for years with great success.

I do have NT-H1, the new MX-5 and Kryonaut laying around, so maybe I should remount again.. or maybe I need to use liquid metal? I have both Conductonaut and Liquid Ultra here so..


----------



## ducegt

I just did the Time Spy Extreme stress test, and pushing 430W through my Trio X resulted in the following max temps: core 80C, memory 100C, and hotspot 93C. It's operating properly, so no worries here.


----------



## acoustic

Imprezzion said:


> I am so massively frustrated right now. Like, what do I have to do to get proper temps on a 3080...
> 
> I run a full EVGA FTW3 Hybrid block on mine, and while the VRAM temperatures are great (72-74C the whole time), the core is still terrible.. after 3 remounts it hasn't changed 1 degree.. I even put over €100 worth of high-performance static-pressure fans (Alpenföhn Wing Boost 3, 2200RPM) on the rad in push-pull, and (at 1350RPM) the core sits at 55-58C with hotspot at 70-72C. First, I'm still quite shocked at how far the hotspot is off the edge/core temp, and it still seems super high for a proper 240 AIO designed specifically for a 3080..
> 
> Do I really have to remount again with different paste or whatever? Every time I pull it apart the spread is perfect, and it's not like I'm using a bad paste; Prolimatech PK-3 is a top-10 paste and I've used it for years with great success.
> 
> I do have NT-H1, the new MX-5 and Kryonaut laying around, so maybe I should remount again.. or maybe I need to use liquid metal? I have both Conductonaut and Liquid Ultra here so..


First, I think you've been worrying far too much about hotspot temps. 55-58c is great for a 240mm rad. I don't know why you are expecting miracles from a kit that isn't meant to give top notch cooling performance; it's meant to operate silently while being slightly better than the stock cooler, and that's exactly what it does.

I top at 60-63c on my FTW3 Ultra with 3 A12x25s @ 1100rpm (once temps on the VRM hit 48c). I find that heavy raster games tend to beat the card up more than anything else. Besides Metro Exodus, I slap my 450watt power limit constantly while playing Hell Let Loose.

I hit 63c, and the card is still at 2025. The performance difference is so negligible it's not worth going custom loop and dropping $1500 in hardware. I let the Hybrid GPU fan kick up a tad to move some of the heat around the memory and power stages, maybe the fan will get to 50% max when the card is at 63c. It's still nearly dead silent.

You're expecting far too much from the Hybrid kit, and you're worrying about temps that are fine. 55-58c is solid. Getting the card down to 45c might net you another ~30MHz, but at the end of the day it won't be visible in real-world use. I get wanting to get the best out of your hardware, but you've invested a ton of money trying to make a Hybrid kit cool the card like a full-cover block, and it's simply not built for that.

Btw, I'm using the stock paste that was on the hybrid cooler when it came out of the box. I was going to put some KingpinX on it, but forgot to bring it with me when I picked the cooler up. Do I think about pulling the card apart and swapping the paste? Sometimes, and maybe when I'm bored, but I wouldn't do it expecting some massive performance increase; I'd do it for the fact that I'm a psycho and like having the best paste on my hardware.

Either way man.. if you really want to get low temps, custom loop with a full-cover block is the only way to achieve that. All this stuff you're doing with the hybrid cooler is trying to put lipstick on a pig. The hybrid kit is good for what it is (silence, slightly better than stock cooler performance) but it's never going to be a super effective cooler with that thin ass radiator and the high heat generated by these cards.


----------



## Imprezzion

acoustic said:


> First, I think you've been worrying far too much about hotspot temps. 55-58c is great for a 240mm rad. I don't know why you are expecting miracles from a kit that isn't meant to give top notch cooling performance; it's meant to operate silently while being slightly better than the stock cooler, and that's exactly what it does.
> 
> I top at 60-63c on my FTW3 Ultra with 3 A12x25s @ 1100rpm (once temps on the VRM hit 48c). I find that heavy raster games tend to beat the card up more than anything else. Besides Metro Exodus, I slap my 450watt power limit constantly while playing Hell Let Loose.
> 
> I hit 63c, and the card is still at 2025. The performance difference is so negligible it's not worth going custom loop and dropping $1500 in hardware. I let the Hybrid GPU fan kick up a tad to move some of the heat around the memory and power stages, maybe the fan will get to 50% max when the card is at 63c. It's still nearly dead silent.
> 
> You're expecting far too much from the Hybrid kit, and you're worrying about temps that are fine. 55-58c is solid. Getting the card down to 45c might net you another ~30MHz, but at the end of the day it won't be visible in real-world use. I get wanting to get the best out of your hardware, but you've invested a ton of money trying to make a Hybrid kit cool the card like a full-cover block, and it's simply not built for that.
> 
> Btw, I'm using the stock paste that was on the hybrid cooler when it came out of the box. I was going to put some KingpinX on it, but forgot to bring it with me when I picked the cooler up. Do I think about pulling the card apart and swapping the paste? Sometimes, and maybe when I'm bored, but I wouldn't do it expecting some massive performance increase; I'd do it for the fact that I'm a psycho and like having the best paste on my hardware.
> 
> Either way man.. if you really want to get low temps, custom loop with a full-cover block is the only way to achieve that. All this stuff you're doing with the hybrid cooler is trying to put lipstick on a pig. The hybrid kit is good for what it is (silence, slightly better than stock cooler performance) but it's never going to be a super effective cooler with that thin ass radiator and the high heat generated by these cards.


I understand your point completely. I mean, I put the Hybrid FTW3 on my Gigabyte Gaming OC because the stock cooler had a dead fan motor and I'm not sending it in for warranty as that'll take months and the card is not replaceable for anywhere near what I paid for it.

It fits the block and the VRAM plate as it's the same on all 3080 models, just not the VRM heatsink, so I chopped up the stock Gaming OC cooler and used the VRM parts of that. I run it without the shroud and with the FTW3 rad fans on the card itself, and I use 4 Alpenfohn Wing Boost 3 ARGB 2200RPM fans in push pull on the rad. Also put an Accelero IV backplate on it, drilled to fit the stock screw holes for the stock backplate. Pic:


----------



## acoustic

Imprezzion said:


> I understand your point completely. I mean, I put the Hybrid FTW3 on my Gigabyte Gaming OC because the stock cooler had a dead fan motor and I'm not sending it in for warranty as that'll take months and the card is not replaceable for anywhere near what I paid for it.
> 
> It fits the block and the VRAM plate as it's the same on all 3080 models, just not the VRM heatsink, so I chopped up the stock Gaming OC cooler and used the VRM parts of that. I run it without the shroud and with the FTW3 rad fans on the card itself, and I use 4 Alpenfohn Wing Boost 3 ARGB 2200RPM fans in push pull on the rad. Also put an Accelero IV backplate on it, drilled to fit the stock screw holes for the stock backplate. Pic:
> 
> View attachment 2490177


I hear you man. I get the desire to maximize performance, but like I said .. lipstick on a pig. I think you would enjoy investing in the full-cover block and going balls deep.


----------



## Imprezzion

acoustic said:


> I hear you man. I get the desire to maximize performance, but like I said .. lipstick on a pig. I think you would enjoy investing in the full-cover block and going balls deep.


Well yeah. I mean, my CPU "loop" could use a well-deserved upgrade as well, and I did look into a full loop before, but it would cost me a LOT of money for the loop alone..

Nemesis GTX 420 + 280 rads, D5 with tube res, bitspower summit m black edition block for the CPU, Bykski full cover for the GPU, black ZMT tubing, black fittings..

Especially in these more challenging times I'm not spending that on something that technically won't really net me any real-world performance increase.


----------



## acoustic

I know it's expensive. I've been on the fence about doing the same **** haha. My EVGA 360 CLC struggles with the 10900K even with direct-die and 3x A12x25 (I had to lap the CLC coldplate to even get contact with the direct-die. It was incredibly convex.. total nightmare lol). My temps are "ok" but I'm still not happy with it. The problem with going custom loop is the cost.. I could easily get the setup going with my 011D XL, but man.. between fittings, radiators, pump/res, water flow/temp sensor, the blocks (looked at the Optimus Foundation and the Optimus GPU block, but would probably go with Watercool's product once it's out) and man.. it just adds up big time. I was looking at $900 before I had added the radiators, fittings, soft tubing.. I said **** it and scrapped the entire idea.

The hybrid cost me $150 or something like that, and I already had the Noctua fans waiting. I achieved the same thing which was silence, and now the money I didn't spend goes towards AlderLake hopefully at the end of this year and that 42" LG OLED


----------



## marashz

acoustic said:


> I was looking at $900 before I had added the radiators, fittings, soft tubing.. I said **** it and scrapped the entire idea.


Can you list all the parts on your $900 list? Because now it sounds more like "I checked Ferrari prices, **** cars, I better stay with the bicycle".


----------



## Imprezzion

I came out at right around €700 for my setup.

EK Evolv X Distro plate + D5
Nemesis GTX 420 + 280
Bykski 3080 Gaming OC block + backplate
Bitspower summit m mystic black CPU block
EK ZMT tubing (or if it's crap anything else)
Fittings and such.
Fans and fluid I have.


----------



## acoustic

marashz said:


> Can you list all parts of your 900$ list? Because now it sounds more like "I checked Ferrari prices, **** cars, I better stay with bicycle".


Optimus Foundation, Optimus GPU block for 3080 FTW3, Optimus Pump/Res (forgot the size..8.5"?) w/ D5 pump.. I think that was at $800 already? I forget. I wanted the Foundation CPU block and it was easy to order this stuff off the same site.

I'm aware there are cheaper products on the market, and some that are high quality while being priced better, but I did like the styling of the Optimus products. The Pump/Res definitely seemed pretty overpriced.

Either way, the hybrid gets the job done in terms of silence. I would love to throw the 3080 under water, but what would I gain for the investment? Maybe 3-5% performance? The lower temps would definitely open up my 10900K a bit more, but .. what would I gain at 3840x1600?

I'm not worried about the GPU hitting 63c. My 10900K only hits 58-60c in gaming loads. Works for me, and it's quiet. I definitely benefit from the lower ambients in my basement, and the GPU rad running as an intake.


----------



## Imprezzion

acoustic said:


> Optimus Foundation, Optimus GPU block for 3080 FTW3, Optimus Pump/Res (forgot the size..8.5"?) w/ D5 pump.. I think that was at $800 already? I forget. I wanted the Foundation CPU block and it was easy to order this stuff off the same site.
> 
> I'm aware there are cheaper products on the market, and some that are high quality while being priced better, but I did like the styling of the Optimus products. The Pump/Res definitely seemed pretty overpriced.
> 
> Either way, the hybrid gets the job done in terms of silence. I would love to throw the 3080 under water, but what would I gain for the investment? Maybe 3-5% performance? The lower temps would definitely open up my 10900K a bit more, but .. what would I gain at 3840x1600?
> 
> I'm not worried about the GPU hitting 63c. My 10900K only hits 58-60c in gaming loads. Works for me, and it's quiet. I definitely benefit from the lower ambients in my basement, and the GPU rad running as an intake.


My GPU rad is top exhaust with the rear fan acting as an intake for fresh air, but it does have to do all the exhausting together with a single fan mounted in front of it (it's a 240 rad in the rear 2 slots of the 360 rail, with 1 exhaust fan in the front top to exhaust CPU air).


----------



## davidm71

Does anyone know if there are aftermarket heatsinks for the MSI 3080 Ventus? The stock cooler only makes 2/3 contact with the RAM! Explains why the RAM temps are so high!


----------



## Imprezzion

davidm71 said:


> Does anyone know if there are aftermarket heatsinks for the MSI 3080 Ventus? The stock cooler only makes 2/3 contact with the RAM! Explains why the RAM temps are so high!


Only thing I know of that works is an EVGA Hybrid XC3/FTW3 kit. The block and included VRAM plate will fit any 3080. The VRM heatsinks and shroud(?) probably won't tho, so you'd have to MacGyver something for that.


----------



## felix121

Hi, just purchased a GIGABYTE AORUS RTX 3080 Gaming Box (WATERFORCE) external graphics card; it should arrive in a few days... anyone know about its performance?


----------



## lmfodor

Hi! I wonder if anyone has tried the new BIOS version for the ASUS TUF 3080. I just noticed that V5 was released and I can't find the difference from V4. Any benefits; is it worth the upgrade?
Thanks!


----------



## Panchovix

lmfodor said:


> Hi! I wonder if anyone has tried the new BIOS version for the ASUS TUF 3080. I just noticed that V5 was released and I can't find the difference from V4. Any benefits; is it worth the upgrade?
> Thanks!


Good question; I have V3 installed on my TUF 3080. Wondering if V5 would make a difference in something; it seems it was released on 04/30.


----------



## Nizzen

lmfodor said:


> Hi! I wonder if anyone has tried the new BIOS version for the ASUS TUF 3080. I just noticed that V5 was released and I can't find the difference from V4. Any benefits; is it worth the upgrade?
> Thanks!


Have you ever seen an official GPU BIOS that provided more performance?


----------



## Imprezzion

Yes, if the power limits or the base clock table is changed it definitely can..


----------






## lmfodor

Panchovix said:


> Good question; I have V3 installed on my TUF 3080. Wondering if V5 would make a difference in something; it seems it was released on 04/30.


Well, the description is clear:

NVIDIA Resizable BAR on systems with an ASUS GeForce RTX 30 series graphics card
This VBIOS update (v5) is offered only to solve issues that may have emerged after an earlier update. If you successfully installed the previous VBIOS (v4) and have no issues, then there is no need to update it.

So, better to stay on v4!


Sent from my iPhone using Tapatalk Pro


----------



## Malpractis

Has anyone used Fujipoly or Thermalright thermal pads with the EK Quantum Vector block & backplate? Just wondering if I should get 1mm or 1.5mm due to how much softer they are than the ones that come with the EK block.


----------



## Falkentyne

Malpractis said:


> Has anyone used Fujipoly or Thermalright thermal pads with the EK Quantum Vector block&backplate? Just wondering if I should get 1mm or 1.5mm due to how much softer they are than the ones that come with the EK block.


Fujipoly and Thermalright pads are not soft at all.
What you want are Gelid Extreme pads.


----------



## Sleepycat

Hi all, I have a question regarding memory overclocking the RTX 3080. I have an EVGA XC3 Ultra and back in January this year, I could only give it a +400 to memory (9902 MHz) while staying stable in Time Spy and Port Royal with the drivers at the time. Fast forward to today, running the latest drivers, I'm somehow able to now achieve a +740 (10242 MHz) memory overclock and still run through Time Spy and Port Royal successfully, and pass 1 hour of OCCT's VRAM stability testing as well. 

Has anyone else noticed anything like this and what are the typical memory overclocks that you achieve with the RTX 3080 on the stock AIB air cooler?
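For anyone sanity-checking offsets like that: the monitoring tools report half the effective data rate, so a stock 3080 reading of 9502 MHz corresponds to ~19 Gbps and the spec-sheet 760 GB/s over the 320-bit bus. A quick back-of-envelope sketch (the 9502 MHz stock reading and the conversion are my assumptions, not from the post above):

```python
# Rough GDDR6X bandwidth arithmetic for RTX 3080 memory offsets.
# Assumes the monitoring tool reports half the effective data rate,
# i.e. a stock reading of 9502 MHz -> 19004 MT/s effective.

STOCK_READING_MHZ = 9502   # assumed stock reading on a 3080
BUS_WIDTH_BITS = 320       # 3080 memory bus width

def bandwidth_gbs(reading_mhz: float, bus_bits: int = BUS_WIDTH_BITS) -> float:
    """Effective bandwidth in GB/s from the tool's MHz reading."""
    data_rate_mts = reading_mhz * 2             # DDR: two transfers per clock
    return data_rate_mts * bus_bits / 8 / 1000  # bits -> bytes, MB/s -> GB/s

stock = bandwidth_gbs(STOCK_READING_MHZ)           # ~760 GB/s, matches spec
plus_740 = bandwidth_gbs(STOCK_READING_MHZ + 740)  # the +740 offset above
print(f"stock: {stock:.2f} GB/s, +740: {plus_740:.2f} GB/s")
```

So a +740 offset is only about an 8% bandwidth bump on paper, and real-world gains can be smaller once GDDR6X error retries start kicking in.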


----------



## Malpractis

Falkentyne said:


> Fujipoly and Thermalright pads are not soft at all.
> What you want are Gelid Extreme pads.


Ah okay nice, so I should be fine with going the same thickness as the EK ones. 
Yeah, the Gelid is super expensive though, comparatively.

If I did go Gelid should I opt for 0.5mm thicker to account for how soft they are? Or just stick with the 1mm (waterblock) & 1+2mm (backplate)?


----------



## jura11

Malpractis said:


> Ah okay nice, so I should be fine with going the same thickness as the EK ones.
> Yeah, the Gelid is super expensive though, comparatively.
> 
> If I did go Gelid should I opt for 0.5mm thicker to account for how soft they are? Or just stick with the 1mm (waterblock) & 1+2mm (backplate)?


The EKWB pads are rubbish; I think they're something like 3.5 W/mK, while the Gelid pads are 12 W/mK.

If they're on the expensive side, try getting them from AliExpress; they're usually a bit cheaper there.

Stick with the same thickness, if you can get them in the thicknesses you need for the waterblock and backplate.

Hope this helps

Thanks, Jura
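To put those W/mK numbers in perspective: for a flat pad, heat flow scales with conductivity divided by thickness, so even a thicker Gelid pad out-conducts the stock EK one. This is just the bulk-material arithmetic (contact resistance ignored), using the conductivities quoted above:

```python
# Thermal pad comparison: conductance per unit area = conductivity / thickness.
# Pad conductivities from the post above (EK ~3.5 W/mK, Gelid ~12 W/mK).

def conductance(w_per_mk: float, thickness_mm: float) -> float:
    """W per degC per square metre through the pad (contact resistance ignored)."""
    return w_per_mk / (thickness_mm / 1000)

ek_1mm = conductance(3.5, 1.0)       # 3500 W/degC/m^2
gelid_1mm = conductance(12.0, 1.0)   # 12000 W/degC/m^2
gelid_15mm = conductance(12.0, 1.5)  # 8000 W/degC/m^2
print(gelid_1mm / ek_1mm, gelid_15mm / ek_1mm)
```

In other words, even a 1.5mm Gelid pad still conducts roughly 2.3x better than a 1mm EK pad, which is why going slightly thicker for better contact is usually a fine trade.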


----------



## Falkentyne

Malpractis said:


> Ah okay nice, so I should be fine with going the same thickness as the EK ones.
> Yeah the Gelid is super expensive though comparitvely.
> 
> If I did go Gelid should I opt for 0.5mm thicker to account for how soft they are? Or just stick with the 1mm (waterblock) & 1+2mm (backplate)?


You are going to have to test this yourself. I don't have your card nor your cooling solution.

The 120mm * 120mm Gelid Extreme (labeled GP-02) pads on Aliexpress are a great value. What is not so great is the slow boat shipping. Having to wait three weeks (if you're lucky) for thermal pads you need right now turns into a "time is money" argument, and then you end up buying the 85mm * 40mm pads from Amazon or Digikey.

Also, there are no 2mm or 3mm Gelid Extremes in the giant 120mm * 120mm pads. They only go up to 1.5mm thickness.


----------



## Hiikeri

I think a typical OC for the RAM is something like +800-1200 MHz.
I have mined on my Suprim X at +1500 MHz, stable for a couple of days. Of course, the memory junction temp at those clocks was 108C+ (with stock pads).
Good cards (Asus ROG, MSI Suprim) usually do +1200-1500 MHz on the GDDR6X.

The EVGA 3000 series, at least all the EVGA 3080 cards, are pretty poor, low-range cards, even the FTW3 Ultra.

Those cards use an analog voltage regulator. Only a couple of 3080 models on the market have those cheap and inaccurate regulators, and the XC3/FTW3 cards are among them. ****ty design.


----------



## mouacyk

It doesn't look like TPU has updated its vBIOS database with ReBAR BIOSes. Does anyone know of another way to get the various BIOSes to try? I'm looking for the Gigabyte Gaming OC and EVGA XC3 Ultra versions with ReBAR support. Thanks.

EDIT: Never mind, found the unverified section.


----------



## lmfodor

Hi, I’d like your guidance: I want to change the thermal pads on my Asus TUF 3080 OC. Reviewing the forum, I saw some photos showing that the pad thicknesses are 3 mm for the back and 2 mm for the front. I was surprised; I thought they were going to be 1.5 or 1 mm. I also want to add pads to the GDDR6X memory; what sizes and brands do you recommend? I saw some Gelid 15 W/mK pads on Amazon; are they good? Then for the GPU core I was thinking of Kryonaut Extreme; does that look good, or is it not necessary for the GPU?

I want to make this change since I've noticed that the memory is the component that generates the most heat, and in fact it is where I can apply the least OC.

Thank you very much for your advice




----------



## AveragePC

I currently have an EVGA RTX 3080 XC3 Black with a 650 W PSU, but recently picked up an EVGA RTX 3080 FTW3 Ultra. I bought the XC3 initially because I didn't want to replace my PSU. Power supply calculators suggest my build's load wattage is 479 W, with a recommended PSU of 529 W. The PSU is a Corsair RMx 650.

Is it worth swapping out the XC3 for the FTW3?
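For what it's worth, the swap can be sanity-checked with simple arithmetic. Assuming the XC3's ~366 W and the FTW3 Ultra's ~450 W max power limits mentioned elsewhere in this thread (both assumptions; check your own cards' BIOS limits), a rough worst-case sketch:

```python
# Back-of-envelope PSU headroom check for swapping an XC3 for an FTW3 Ultra.
# Power-limit figures (366 W / 450 W) are assumptions based on this thread.

PSU_W = 650
CURRENT_LOAD_W = 479   # calculator estimate with the XC3
XC3_LIMIT_W = 366      # assumed XC3 max power limit
FTW3_LIMIT_W = 450     # assumed FTW3 Ultra max power limit

def headroom_pct(psu_w: float, load_w: float) -> float:
    """Remaining PSU capacity as a percentage."""
    return (psu_w - load_w) / psu_w * 100

# Worst case: the FTW3 draws its full limit, everything else unchanged.
new_load = CURRENT_LOAD_W + (FTW3_LIMIT_W - XC3_LIMIT_W)
print(f"new load ~{new_load} W, headroom {headroom_pct(PSU_W, new_load):.0f}%")
```

Roughly 13% headroom at the full power limit is workable but thin, and Ampere's transient power spikes are known to trip over-current protection on some units, which is probably the real risk here.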


----------



## mouacyk

You bought a 3080 FTW Ultra but somehow can't afford to upgrade to a quality higher capacity PSU?


----------



## n00bftw

I currently have an i9 7900X and a 1080 Ti FE, both overclocked; the CPU is at 4.6 GHz.

 

All of this is running in a single custom water loop with 2x Alphacool NexXxoS ST30 Full Copper 420mm radiators (push configuration) and a D5 pump. The case is a Dark Base Pro 900; one of the radiators is intake (front) and one is exhaust (top). The only other fan in the system is a 140mm exhaust out the back of the case.

Current System Temps while gaming:

Water Temp: 40c

Water Pump Speed: 2.9

Radiator Fans RPM: 1000/1100 RPM

CPU Temp: 62-72c

GPU Temp: 46c

Question: if I replace the 1080 Ti, which has a stock TDP of 250 W (FYI, my GPU is overclocked so it will be higher), with a 3080 (TDP 320 W) or a 3090 (TDP 350 W), and don't overclock the new GPU, (1) will my radiators be able to handle the extra heat?

(2) Will I have to increase my fan speed? By how much?

(3) Will the water loop run hotter, and how much hotter? Will it be too hot for the soft tubing I use?

(4) Will the additional heat of the 3080/3090 heat up my CPU more than it is now?


Thanks in advance.
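A rough way to reason about (1)-(4): treat the loop as dissipating heat in proportion to the coolant-to-ambient delta, and calibrate that proportion from your current readings. The ~430 W current load and 25 °C ambient below are guesses on my part (plug in your own numbers), so this is just a sketch of the estimate:

```python
# Rough coolant-temperature estimate when adding GPU wattage to a loop.
# Model: heat dissipated ~ k * (water_temp - ambient), with k calibrated
# from current readings at fixed fan/pump speeds.
# The 430 W current load and 25 degC ambient are guesses; substitute your own.

AMBIENT_C = 25.0
WATER_NOW_C = 40.0
HEAT_NOW_W = 430.0  # guessed: CPU (~150 W) + overclocked 1080 Ti (~280 W)

k = HEAT_NOW_W / (WATER_NOW_C - AMBIENT_C)  # W per degC of delta

def water_temp(heat_w: float) -> float:
    """Predicted water temp at the same fan/pump speeds."""
    return AMBIENT_C + heat_w / k

for gpu_w, name in [(320, "3080"), (350, "3090")]:
    new_heat = HEAT_NOW_W - 280 + gpu_w  # swap out the guessed GPU wattage
    print(f"{name}: ~{water_temp(new_heat):.1f} degC water")
```

Under those assumptions the loop only climbs a degree or two at the same fan speeds, staying well under the ~50°C soft-tubing comfort zone, and the CPU block sees coolant only slightly warmer than today.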


----------



## AveragePC

mouacyk said:


> You bought a 3080 FTW Ultra but somehow can't afford to upgrade to a quality higher capacity PSU?


I’m being lazy, sorry.


----------



## Imprezzion


AveragePC said:


> I’m being lazy, sorry.


You only have a 5600X, which is a low-power CPU, so it _should_ be fine, but it's on the very edge.


----------



## Lt.Tenshi

n00bftw said:


> I currently have an i9 7900x and a 1080TI FE, both are overclocked, CPU is at 4.6Ghz.
> 
> 
> 
> All of this is running in a single custom waterloop with x2 Alphacool NexXxoS ST30 Full Copper 420mm radiators (Push configuration) and a D5 Pump. The case is a Dark Base Pro 900, one of the radiators is intake (front), and one is exhaust (top), only other fan in the system is an exhaust fan (140mm) out the back of the case.
> 
> Current System Temps while gaming:
> 
> Water Temp: 40c
> 
> Water Pump Speed: 2.9
> 
> Radiator Fans RPM: 1000/1100 RPM
> 
> CPU Temp: 62-72c
> 
> GPU Temp: 46c
> 
> Question, if I replace the 1080TI, which has a TDP of 250 stock (FYI my GPU is overclocked so will be higher), with the 3080 (TDP 320) or the 3090 (TDP 350), and don’t overclock the new GPU, (1.) will my radiators be able to handle the extra heat?
> 
> (2.) Will I have to increase my fan speed? By how much…
> 
> (3.) Will the water loop run hotter, how much hotter? Will it be too hot for the soft tubing I use?
> 
> (4.) Will the additional heat of the 3080/3090 heat up my CPU more than it is now?
> 
> 
> Thanks in advance.


I am not really running a similar setup atm (5800X, RTX 3080), but before upgrading from my 2080, GPU temperatures were close enough (44°C)... so here goes:
Phanteks 719 with 360mm UT60, 360mm XT30, 480mm XT30 and 10 Bequiet! SW3 running at [email protected] and another 4 140mm SW3 as case intake and exhaust.
5800X: up to 75°C
RTX 3080: up to 63°C
Water temp: 34°C
Ambient temp: 25-26°C

The RTX 3080 is running quite hot compared to the 2080. It could totally be my fault while installing the waterblock (EKWB), but then again it is 100+ W of extra heat energy in the loop.
So yes, with the GPU transition you will have to ramp up those fans in order to keep the same coolant temp (40°C). The "how much" part depends on many variables like airflow, the fans themselves, and heat saturation of the rads. CPU temps will "suffer" in relation to the coolant temp increase.
Definitely try to keep your coolant below 50°C, as soft tubing tends to get... well, even softer, and tight bends could easily collapse, blocking flow entirely.
I assume you are using a custom fan curve with your coolant temp as a reference. It goes without saying that you will have to revise that curve until you are comfortable with coolant temperature and acoustics.
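The coolant-referenced fan curve I mean is just a piecewise-linear map from water temp to fan duty; most fan-control software lets you enter something equivalent. A sketch (the breakpoints here are made-up placeholders, not my actual curve):

```python
# Piecewise-linear fan curve keyed to coolant temperature, as a sketch.
# Breakpoints are placeholders; tune them to your own coolant/acoustic comfort.

CURVE = [(30.0, 30), (35.0, 45), (40.0, 70), (45.0, 100)]  # (degC, % duty)

def fan_duty(coolant_c: float) -> float:
    """Linearly interpolate fan duty (%) from coolant temperature."""
    if coolant_c <= CURVE[0][0]:
        return CURVE[0][1]
    if coolant_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= coolant_c <= t1:
            return d0 + (d1 - d0) * (coolant_c - t0) / (t1 - t0)

print(fan_duty(37.5))  # halfway between the 45% and 70% breakpoints
```

The nice property of keying on coolant rather than CPU/GPU temperature is that the fans ignore short load spikes (the water is a big thermal buffer) and only ramp when the loop genuinely heats up.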


----------



## n00bftw

Lt.Tenshi said:


> I am not really running a similar setup atm (5800X, RTX 3080), but before upgrading from my 2080, GPU temperatures were close enough (44°C)... so here goes:
> Phanteks 719 with 360mm UT60, 360mm XT30, 480mm XT30 and 10 Bequiet! SW3 running at [email protected] and another 4 140mm SW3 as case intake and exhaust.
> 5800X: up to 75°C
> RTX 3080: up to 63°C
> Water temp: 34°C
> Ambient temp: 25-26°C
> 
> The RTX 3080 is running quite hot compared to the 2080. It could totally be my fault while installing the waterblock (EKWB), but then again it is 100+ W of extra heat energy in the loop.
> So yes, with the GPU transition you will have to ramp up those fans in order to keep the same coolant temp (40°C). The "how much" part depends on many variables like airflow, the fans themselves, and heat saturation of the rads. CPU temps will "suffer" in relation to the coolant temp increase.
> Definitely try to keep your coolant below 50°C, as soft tubing tends to get... well, even softer, and tight bends could easily collapse, blocking flow entirely.
> I assume you are using a custom fan curve with your coolant temp as a reference. It goes without saying that you will have to revise that curve until you are comfortable with coolant temperature and acoustics.


What case have you got?


----------



## Lt.Tenshi

n00bftw said:


> What case have you got?


Phanteks Enthoo 719 (aka Luxe2)


----------



## jura11

Lt.Tenshi said:


> I am not really running a similar setup atm (5800X, RTX 3080), but before upgrading from my 2080, GPU temperatures were close enough (44°C)... so here goes:
> Phanteks 719 with 360mm UT60, 360mm XT30, 480mm XT30 and 10 Bequiet! SW3 running at [email protected] and another 4 140mm SW3 as case intake and exhaust.
> 5800X: up to 75°C
> RTX 3080: up to 63°C
> Water temp: 34°C
> Ambient temp: 25-26°C
> 
> The RTX 3080 is running quite hot compared to the 2080. It could totally be my fault while installing the waterblock (EKWB), but then again it is 100+ W of extra heat energy in the loop.
> So yes, with the GPU transition you will have to ramp up those fans in order to keep the same coolant temp (40°C). The "how much" part depends on many variables like airflow, the fans themselves, and heat saturation of the rads. CPU temps will "suffer" in relation to the coolant temp increase.
> Definitely try to keep your coolant below 50°C, as soft tubing tends to get... well, even softer, and tight bends could easily collapse, blocking flow entirely.
> I assume you are using a custom fan curve with your coolant temp as a reference. It goes without saying that you will have to revise that curve until you are comfortable with coolant temperature and acoustics.


Hi there

I think you have a bad mount on the GPU. I would recommend you check the TIM imprint on the block and core, check the height of the thermal pads, etc.

Did you use different thermal pads, or are you using the supplied thermal pads from EK, and what TIM did you use?

With such a case and that much radiator space you should have GPU temperatures close to what you had previously on the RTX 2080. At least, that's the case for me: running the same radiator space as I ran with 2 RTX 2080 Tis and a GTX 1080 Ti (and previously a 4-GPU setup), temperatures have been pretty much the same at 36-38°C on all GPUs, with the RTX 2080 Ti Strix on the XOC BIOS the hottest at 38°C.

Hope this helps

Thanks, Jura


----------



## Lt.Tenshi

jura11 said:


> Hi there
> 
> I think you have a bad mount on the GPU. I would recommend you check the TIM imprint on the block and core, check the height of the thermal pads, etc.
> 
> Did you use different thermal pads, or are you using the supplied thermal pads from EK, and what TIM did you use?
> 
> With such a case and that much radiator space you should have GPU temperatures close to what you had previously on the RTX 2080. At least, that's the case for me: running the same radiator space as I ran with 2 RTX 2080 Tis and a GTX 1080 Ti (and previously a 4-GPU setup), temperatures have been pretty much the same at 36-38°C on all GPUs, with the RTX 2080 Ti Strix on the XOC BIOS the hottest at 38°C.
> 
> Hope this helps
> 
> Thanks, Jura


Thanks for the heads up, Jura.
I used the supplied Ectotherm TIM and pads, following the instructions to the letter.
I was planning on opening up the waterblock for cleaning anyway, as something has formed around the gasket that resembles oil leaching.

edit: Aaaand whoops... I made a mistake on the reported GPU temp (63°C instead of the 53°C I actually get). In any case, it does no harm to check it out while I'm at it.


----------



## badaz

Hello, I'd like to raise the power limit of my MSI RTX 3080 Ventus 3X 10GB. The best candidate BIOS seems to be the one from the ASUS RTX 3080 TUF OC 10GB, since its power limit is higher and it too has 2x 8-pin connectors. However, the MSI card has 1x HDMI and 3x DP ports, whereas the Asus has 2x HDMI and 3x DP ports. Is this a problem? Do I risk getting into an unrecoverable state if the Asus BIOS is not compatible? Has anyone done this here already?


----------



## acoustic

Finally spent some time on it now that summer has started and I have the AC on, and was able to break 13k on my 3080 FTW3 Ultra w/ the 240mm Hybrid kit. Prior to the AC being on, I was stuck around ~12850.

+130core/+650mem









I scored 13 026 in Port Royal
Intel Core i9-10900K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10
www.3dmark.com


----------



## TechnoPeasant

I got an Asus TUF 3080 in the mail today. Looking to get this thing set up to squeeze some more perf out in the next couple weeks. Is it safe to download a TUF OC BIOS from Asus and attempt to flash it onto the plain ole TUF?


----------



## mouacyk

TechnoPeasant said:


> I got an Asus TUF 3080 in the mail today. Looking to get this thing setup to squeeze some more perf out in the next couple weeks. Is it safe to download a TUF OC BIOS from Asus, and attempt to flash it onto the plain ole TUF?


Yes, just don't interrupt the flash, and make sure your PC is completely stable for the flashing. I wish I could get stuff like that in the mail today.


----------



## CattBoy

Just enabled Resizable BAR on my 3080 TUF OC with RTX3080_V5.exe (I have not used any version prior; first time enabling ReBAR). Installed GPU Tweak III, ran the .exe, changed the MoBo values; smooth process.

Now, precision X1 says new vbios. Should I update?










What vBIOS should I be on with the TUF OC? Currently 94.02.42.40.66

Should I use NVFlash to back up 94.02.42.40.66?

Thx


----------



## Panchovix

CattBoy said:


> Just enabled Resizable Bar on 3080 TuF OC with RTX3080_V5.exe (Have not used any version prior, 1st time enabling RB). Installed GPU Tweak III, ran .exe, changed MoBo values, smooth process.
> 
> Now, precision X1 says new vbios. Should I update?
> 
> View attachment 2514593
> 
> 
> What vbios should I be on with TuF OC? Currently 94.02.42.40.66
> 
> Should I use NVFlash to backup 94.02.42.40.66 ??
> 
> Thx


There's an even newer one? I've installed V3 and it doesn't let me update to V5 lol
If you do a backup, could you upload it to TechPowerUp, to see if there's any difference between V3 and V5?


----------



## KShirza1

Popped in my old 3080 and bypassed my loop in my main pc while I water cool my 3080 Ti. Forgot to join this group last year!

Build log










3080 powering my c9 in my game room rig


----------



## ssgwright

CattBoy said:


> Just enabled Resizable Bar on 3080 TuF OC with RTX3080_V5.exe (Have not used any version prior, 1st time enabling RB). Installed GPU Tweak III, ran .exe, changed MoBo values, smooth process.
> 
> Now, precision X1 says new vbios. Should I update?
> 
> View attachment 2514593
> 
> 
> What vbios should I be on with TuF OC? Currently 94.02.42.40.66
> 
> Should I use NVFlash to backup 94.02.42.40.66 ??
> 
> Thx


X1 told me I needed an update as well; clicked update and nothing happens...


----------



## Sleepycat

CattBoy said:


> Just enabled Resizable Bar on 3080 TuF OC with RTX3080_V5.exe (Have not used any version prior, 1st time enabling RB). Installed GPU Tweak III, ran .exe, changed MoBo values, smooth process.
> 
> Now, precision X1 says new vbios. Should I update?
> 
> View attachment 2514593
> 
> 
> What vbios should I be on with TuF OC? Currently 94.02.42.40.66
> 
> Should I use NVFlash to backup 94.02.42.40.66 ??
> 
> Thx


I wouldn't trust EVGA software telling you to update your ASUS card's vbios!


----------



## KShirza1

I’m on my second 3080. The first one was a PNY at launch, bought because I wanted to use HDMI 2.1 on my C9 OLED for the Cyberpunk launch. Sold it to a friend for retail once I got my EVGA card in the post above. There were so many bugs trying to get HDMI 2.1 to work at first, and Cyberpunk failed…


----------



## CattBoy

ssgwright said:


> x1 told me I needed an update as well, click update and nothing happens...


You live dangerously 


Sleepycat said:


> I wouldn't trust EVGA software telling you to update your ASUS card's vbios!


That was my thinking, but I wanted to ask other TUF OC users what their vBIOS is.

Resizable BAR is working and I got ~10% in the title I play so far, so win.


----------



## sdmf74

Does anyone make a waterblock for the 3070 Ti GPU yet?


----------



## Tobitigger

Hey all! I finally built my system with a Gigabyte GeForce RTX 3080 Gaming OC... this is a long card! I guess I am a member of this thread now as well.

One question for all the lucky 3080 owners out there:
I was looking to get more room by adding a short waterblock to it and was eyeing the Alphacool GPX-N model. I was wondering if anyone here has that block and how the power connection is handled. Just a different set of cables going directly from the flat connectors to the PSU?


----------



## Imprezzion

I do know the Bykski block has some solution for the cable/connector issue, though I'm not sure what exactly.

I bought a few custom cables from ModDYI and they just go from PSU PCI-E x8 straight to the flat connector so no more connector block.

I'm running a EVGA FTW3 Hybrid cooler on mine so.. lol.


----------



## Tobitigger

Imprezzion said:


> I'm running a EVGA FTW3 Hybrid cooler on mine so.. lol.


I saw those ModDYI cables, they did look nice on their images, are you happy with these? And also how does the EVGA cooler work for you, do I assume correctly that you use this on your Gaming OC 3080?


----------



## Imprezzion

Yeah I am. The block and VRAM plate are universal for all 3080 models using the standard VRAM arrangement. They fit just fine.

The shroud and VRM heatsink don't fit. Maybe the XC3 version fits? You have to build a VRM heatsink somehow; I just sacrificed the stock cooler for it. I literally angle-ground both VRM portions out of it and screwed it back onto the card, without working heat pipes of course, but it's been cool enough according to the finger test and the IR thermometer, which shows about 55-60C. The back side of the card directly above the VRM doesn't get hot either, so it should be fine.

Temps at max power limit, +90 core, max voltage (2100 MHz when not power throttling) and +1400 memory are around 60C core, 74C hotspot, 78C VRAM. I don't run the rad fans very high though; with a bit more noise, 10C less can easily be done.

One important point: the pump + fan connector is proprietary to the EVGA PCB. You have to cut off the locking tab and click it onto the Gigabyte fan header IN REVERSE. The pinout is reversed!

As for the ModDYI cables, they are still stuck in customs...


----------



## AveragePC

Is the hybrid kit worth getting for an XC3? Gaming at 4K, the card runs around 76-77C once my room starts warming up from the PC.


----------



## Imprezzion

VRAM junction temps and noise will go way down, as will core temps, so yeah, it is. Especially since you keep your warranty and don't have to cut up the cooler to make it fit, as it's already made for an XC3. Will you get more performance or overclocking headroom? No. It's still 2x8-pin at a 366 W limit, so..


----------



## AveragePC

Hmmm, I'll have to see if I can fit it above my Noctua CPU cooler; might be a tight fit.


----------



## acoustic

Don't expect miracles from the hybrid cooler. Temps will drop 10-15C, which is good, but the main benefit is the silence. I think the Hybrid cooler does a good job at what it is, but it struggles on my FTW3 at 450 W, predictably.


----------



## Tobitigger

Interesting stuff, sounds like a lot of work though! I checked out Bykski and Alphacool, their concept is pretty similar. I may go along with one of them.


----------



## Panchovix

The best I could do on my 3080 TUF at 350W. It's not much, but it's honest work lol. The most I can do without being power limited is 2055 MHz at 0.975 V, plus a +1500 MHz VRAM OC (at +1700 MHz I start to lose performance, so I stayed below that just to be safe lol).
(The GTX 950 was in for testing purposes and I forgot to take it out lol. It wasn't included in the benchmarks anyway; it was there to test FSR, which actually works on Maxwell.)


I scored 17 447 in Time Spy
AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10
www.3dmark.com

I scored 12 656 in Port Royal
AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10
www.3dmark.com


----------



## acoustic

That's a crazy good chip to do 2055 @ 0.975 V! Imagine if you had 450 W to play with..

I feel like the higher-wattage cards got worse silicon because you can just turn up the voltage to achieve frequencies, lol. My 3080 FTW3 can't even do 2000 MHz at 1.013 V. I have to run 1.031 V at ~2025.


----------



## Panchovix

acoustic said:


> That's a crazy good chip to do 2055 @ 0.975 V! Imagine if you had 450 W to play with..
> 
> I feel like the higher-wattage cards got worse silicon because you can just turn up the voltage to achieve frequencies, lol. My 3080 FTW3 can't even do 2000 MHz at 1.013 V. I have to run 1.031 V at ~2025.


Thanks! And wow, I thought EVGA only used well-binned chips on their FTW3 models, like ASUS on their Strix models and such.

Yeah, I wish I had more. Even though the VBIOS says the max is 375 W, the most I've seen is 351 W lol


----------



## acoustic

None of the cards are binned. The STRIX isn't binned either, afaik.


----------



## Panchovix

acoustic said:


> None of the cards are binned. The STRIX isn't binned either, afaik.


Oh, that's a bummer. I thought the more expensive models had better binning; I think it actually was like that with the 2000 series.


----------



## lowrider

Hi guys. I just got this beast of a card and was wondering if there are any modded bios that unlock voltage to bring it to aorus levels, or at least add any good tweaks. Anyone knows?


----------



## fray_bentos

lowrider said:


> Hi guys. I just got this beast of a card and was wondering if there are any modded bios that unlock voltage to bring it to aorus levels, or at least add any good tweaks. Anyone knows?


The best tweak is to undervolt while keeping stock performance (with massively reduced power consumption, heat and noise).


----------



## lowrider

fray_bentos said:


> The best tweak is to undervolt while keeping stock performance (with massively reduced power consumption, heat and noise).


Tell me more


----------



## blurp

Search more.
Here it is.


----------



## marashz

fray_bentos said:


> The best tweak is to undervolt while keeping stock performance (with massively reduced power consumption, heat and noise).


Depends. When I was on air cooling I got better scores with undervolting. On water cooling it's worse than just a basic +core/+memory offset.


----------



## fray_bentos

marashz said:


> Depends. When I was on air cooling, then I got better scores with undervolting. On water cooling it's worse than just basic +core +memory.


Sure, but I wouldn't call installing water cooling on a GPU a "tweak".


----------



## fray_bentos

lowrider said:


> Tell me more


I just topped out/flattened my voltage/frequency curve in Afterburner at 0.831 V and 1800 MHz. That seems to give the best balance of performance and noise for me. Performance matches stock, but doing that has cut my power consumption by ~80 W, temps are down 10-15C, and fan noise went from audible even when wearing headphones to not audible at all with headphones on (even in silent scenes). Compared to every other Nvidia card I have owned, the 3080 behaves more like it is overclocked (a little too much) out of the box.
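For anyone wondering roughly where a saving like that comes from, here is a back-of-the-envelope sketch, assuming dynamic power scales with frequency × voltage² (a standard CMOS approximation, not anything measured from this card). The 1.050 V / 1905 MHz "stock boost" point is an assumed typical value:

```python
# Rough estimate of the power saving from flattening the V/F curve,
# assuming dynamic power scales roughly with frequency * voltage^2
# (a standard CMOS approximation; real cards also have static draw,
# so treat this as a ballpark only).

def relative_power(f_new, v_new, f_ref, v_ref):
    """Power at (f_new, v_new) relative to (f_ref, v_ref)."""
    return (f_new / f_ref) * (v_new / v_ref) ** 2

# 0.831 V / 1800 MHz undervolt from this post vs. an ASSUMED typical
# stock boost point of 1.050 V / 1905 MHz (not a measured figure).
ratio = relative_power(1800, 0.831, 1905, 1.050)
saving_w = 320 * (1 - ratio)  # against the 320 W reference TDP

print(f"relative power: {ratio:.2f}")
print(f"estimated saving: ~{saving_w:.0f} W")
```

The crude scaling overestimates the saving (a power-limited card doesn't sit pinned at 320 W all the time), but it shows why a modest voltage drop pays off so much: voltage enters squared.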


----------



## Shadowdane

fray_bentos said:


> I just topped out/flattened my voltage frequency curve in afterburner at 0.831 V at 1800 MHz. That seems to give the best balance of performance / noise for me. Performance matches stock, but doing that has cut my power consumption by ~80 W, temps down by 10-15 C, fan noise went from being audible even when wearing headphones to not being audible at all when wearing headphones (even in silent scenes). Compared to every other Nvidia card I have owned the 3080 behaves more like it is overclocked (a little too much) out of the box.


Yeah, I ultimately ended up undervolting my MSI 3080 Suprim X as well. I didn't want to drop clocks that low though, so I went with 0.975 V at 2010 MHz. It tends to settle one bin lower, at around ~1995 MHz, once the card gets a little warmer. That was the minimum voltage I could run to keep that clock speed stable, which of course can vary per GPU. My card has a stock 430 W BIOS, so this dropped me down to around ~350 W average depending on the game and GPU utilization.


----------



## Imprezzion

Cables showed up from ModDYI for my Gigabyte 3080 Gaming OC to eliminate that ugly block connector. 










And yes, those are the cut-off heatpipes of the stock cooler; that part only cools the VRM and does just fine without the heatpipes. The core is under the EVGA FTW3 Hybrid block, the front VRM is also cut-down stock cooler, and the VRAM is part of the waterblock.


----------



## acoustic

You Frankenstein'd the **** outta that card LOL. I love it


----------



## Imprezzion

acoustic said:


> You Frankenstein'd the **** outta that card LOL. I love it


Yup, bought it without warranty anyway. YOLO.
It works though: 60C core, 75C hotspot, 80C VRAM junction in Metro Exodus Enhanced, everything maxed, no DLSS, ray tracing maxed, bouncing off the power limit constantly. It throttles all the way down to ~1950-1980 MHz due to power limits, unfortunately. Normally I run 2100 MHz when it doesn't power throttle, like in Battlefield 4. Memory is at +1400.

If I remove the backplate and touch the PCB above the VRM, it's barely lukewarm, so it should be fine.


----------



## fray_bentos

Shadowdane said:


> Yah I ultimately ended up under-volting my MSI 3080 Suprim X as well didn't want to drop clocks that low though I went with 0.975v at 2010Mhz. It tends to settle down 1 bin lower at around ~1995Mhz though once the card gets a little bit warmer. At least that was the minimum voltage I could run the card to keep that clock speed stable, which could of course vary per GPU. My card was a stock 430W bios so it dropped me down to around ~350W average depending on the game & GPU utilization.
> 
> View attachment 2516153


Indeed, mine at stock will eventually drop back to 1860 MHz under heavy RTX load (e.g. the Port Royal stress test) because temps downstep the frequency (with fan RPM capped at a noise level I am happy with). So I am not losing much at 1800 MHz set / 1830 MHz get, other than 80-100 W and lots of fan noise. As you have identified, the trick is to find the balance where the voltage/power is just low enough to avoid temp-induced downstepping at a fan noise you are happy with.

I also have a 1730 MHz profile for even lower power usage in older, less demanding games where I am hitting my 165 fps cap. However, I am increasingly finding myself not using the low-power profile, and instead using DSR (Dynamic Super Resolution) to pump up the render resolution until my FPS drops to about 120 fps average. DSR often gives better image quality and antialiasing than most in-game AA implementations (at my 1440p screen resolution). So far, I've used DSR with great success in games such as GTA V, F1 2018 and COD Infinite Warfare. In Project Cars 2, I've eliminated aliasing and shimmering by turning up the in-game supersampling (rather than using AA or DSR), which is essentially the same as DSR but performed in-game. Basically, I am brute-forcing image quality where I have headroom.
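For reference, the DSR factor applies to the total pixel count, so each axis scales by the square root of the factor. A minimal sketch of the math (computed values, not NVIDIA's exact rounded mode list):

```python
# DSR renders at a multiple of the native pixel count, then downscales.
# The factor applies to total pixels, so each axis scales by sqrt(factor).
import math

def dsr_resolution(native_w, native_h, factor):
    """Approximate render resolution for a given DSR factor."""
    scale = math.sqrt(factor)
    return round(native_w * scale), round(native_h * scale)

# Common DSR factors at a 2560x1440 screen:
for factor in (1.78, 2.25, 4.00):
    w, h = dsr_resolution(2560, 1440, factor)
    print(f"{factor}x DSR of 2560x1440 -> {w}x{h}")
```

2.25x lands exactly on 3840x2160, which is why it is a popular choice at 1440p: a clean 1.5x per-axis scale.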


----------



## fray_bentos

Imprezzion said:


> And yes, those are the cut off heatpipes of the stock cooler as that part only cools the VRM and does just fine without the heatpipes. Core is the EVGA FTW3 Hybrid block, front VRM is also cut off cooler and VRAM is part of the waterblock.


But why cut them off? They don't look like they were in the way. Was that done by the previous owner?


----------



## Imprezzion

fray_bentos said:


> But why cut off, they don't look in the way? Was that done by the previous owner?


On that side it would've been fine, correct. But I had to remove the whole core block and section for the waterblock to fit between the halves, and I planned to put the FTW3 Hybrid shroud on it, which is why I had to cut this part as well. The shroud ended up not fitting, so yeah. Now it has no shroud, just two 120mm Alpenföhn Wing Boost 3 fans on it to cool the PCB components.


----------



## fray_bentos

Imprezzion said:


> On that side it would've been fine, correct. But I had to remove the whole core block and section for the waterblock to fit in between the halfs and I planned to put the FTW3 Hybrid shroud on it which is why I had to cut this part as well. Shroud ended up not fitting so yeah. Now it has no shroud and just 2 120mm Alpenfohns Wing Boost 3 on it which cools the PCB components.


Oh. I guess you don't value resale value.


----------



## Imprezzion

fray_bentos said:


> Oh. I guess you don't value resale value.


None whatsoever. All of my previous cards that I modded to heck and back are sold to friends and every single one of them still runs. They put up with it hehe..


----------



## joyzao

Guys

Where can I get a BIOS for the RTX 3080 FTW3 Ultra with a 500 W or 450 W limit and Resizable BAR?


----------



## DStealth

https://forums.evga.com/EVGA-GeForce-RTX-3080-FTW3-XOC-BIOS-m3118560.aspx


Here


----------



## MikeS3000

I've owned a Gigabyte RTX 3080 Gaming OC since November. The memory temps sit in the upper 90s while gaming. Just because I wanted to play around with it, I tried mining for a day. I don't think I want to continue mining, but it opened my eyes to the poor design and heat dissipation of the memory, as temps averaged 105C. Is it worth spending $30 to replace the thermal pads with better ones and add pads to the back of the card if I just plan on gaming?


----------



## Garrett1974NL

MikeS3000 said:


> I've owned a Gigabtye RTX 3080 Gaming OC since November. The memory temps sit in the upper 90s while gaming. Just because I wanted to play around with it, I tried mining for a day. I don't think I want to continue mining. However, this opened my eyes to the poor design and heat dissipation of the memory as temps averaged 105 C. Is it worth it to spend $30 to replace the thermal pads with better ones and add pads to the back of the card if I just plan on gaming?


I would think so, yes; people have reported 20+ degrees LOWER temps after replacing the thermal pads.
Just measure their thickness and order some Gelid GP-Extreme pads; they will do just fine. Let us know how it works out for you.


----------



## MikeS3000

Garrett1974NL said:


> I would think so yes, people have reported 20+ degrees LOWER temps when replacing the thermal pads.
> Just measure their thickness and order some Gelid GP-Extreme pads, they will do just fine, let us know how it works out for you


Funny you mention that, as I had 2 packs of Gelid GP-Extreme pads in my Amazon shopping cart last night but didn't pull the trigger. Fortunately there are some forum posts and YouTube videos on my exact card, and it sounds like I need an 85x45 pack of 2.0mm and a pack of 3.0mm. Does that sound like enough material to cover both sides of the card?


----------



## Imprezzion

MikeS3000 said:


> Funny you mention that as I had 2 packs of Gelid GP-Extreme pads in my Amazon shopping cart last night but didn't pull the trigger. Fortunately there are some forum posts and YouTube videos on my exact card and it sounds like I need an 85x45 pack of 2.0mm and a pack of 3.0mm. Does that sounds like enough material to cover both sides of the card?


The backplate even has guide lines on the underside showing where to place the pads; Gigabyte just didn't bother to put any there. Then again, the backplate is super thin and doesn't have a lot of thermal mass, so I don't know how much it helps. I removed mine and put on an Arctic Accelero IV backplate. That helped by about 5-7C.


----------



## Panchovix

MikeS3000 said:


> I've owned a Gigabtye RTX 3080 Gaming OC since November. The memory temps sit in the upper 90s while gaming. Just because I wanted to play around with it, I tried mining for a day. I don't think I want to continue mining. However, this opened my eyes to the poor design and heat dissipation of the memory as temps averaged 105 C. Is it worth it to spend $30 to replace the thermal pads with better ones and add pads to the back of the card if I just plan on gaming?


When I had my 3060 Ti, a Gigabyte Gaming OC Pro, I had to change the pads at some point, and for some reason they were like gum, leaking some kind of liquid. Some of the worst pads I've ever seen lol.

I had it from December but sold it about a month ago, so since yours is from around then, I suggest you change the pads. I used some Minus8 pads, I think, because it was just GDDR6, not GDDR6X, and they did a ton better than the stock pads. In your case the Gelid Extreme should be fine.


----------



## MikeS3000

Panchovix said:


> When I had my 3060Ti, it was the Gigabyte Gaming OC Pro, and man, I had to change the pads at some point and for some reason they were like gum or something like that, and leaking some type of liquid, like, they were one of the worst pads I've seen ever lol.
> 
> Had it from December but sold it like 1 month ago, so since you have one near that, I suggest you to change the pads, I used some Minus8 pads I think because it was just GDDR6, not GDDR6x and did the job ton better than stock pads. In your case with Gelid extreme you should be fine


I'm getting the pads delivered tonight so should be a fun little project. I'm very curious to see the results.


----------



## felix121

MikeS3000 said:


> I'm getting the pads delivered tonight so should be a fun little project. I'm very curious to see the results.


Let us know how it goes; I'm planning on ordering the same pads...


----------



## Thedarkjester

derm said:


> Does anyone know what the T6 screws that go into the leaf spring are? One of mine is almost stripped and I'd like to replace them, but I have no idea what screws to buy, or if they are even sold anywhere.


Did you ever figure this out?


----------



## MikeS3000

felix121 said:


> Let us know how it goes planning on ordering the same pads....


I just finished the job. First off, the 80x40 2.0mm pad is not enough to replace all of the thermal pads; it was enough to do the memory with a little left over. The 3.0mm 80x40 pad is large enough to do the backplate. I tried some mining: temps previously peaked at around 106C before the replacement, and now I max out at 76C! I'm pretty happy with the results, but I wish I could have replaced all of the greasy Gigabyte pads.
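A quick sanity check on why one sheet covered the memory with some left over: assuming the typical 14x12 mm GDDR6X package footprint (an assumption here, not a figure from this thread), one 80x40 mm sheet tiles more than the ten modules a 10GB card carries.

```python
# Quick area check: does one 80x40 mm pad sheet cover ten GDDR6X chips?
# 14x12 mm is the typical GDDR6X package footprint -- an assumption,
# not a figure from this thread.

PAD_W, PAD_H = 80, 40        # mm, one Gelid GP-Extreme sheet
CHIP_W, CHIP_H = 14, 12      # mm, assumed GDDR6X package size
N_CHIPS = 10                 # a 10GB card carries ten 1GB modules

# How many 14x12 rectangles tile into the sheet (no rotation):
per_sheet = (PAD_W // CHIP_W) * (PAD_H // CHIP_H)
print(f"chips covered per sheet: {per_sheet} (need {N_CHIPS})")
```

Fifteen cutouts fit, so ten memory pads plus a few spares matches the "little left over" reported above.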


----------



## ViTosS

Are you guys getting the boost you set in the V/F curve in MSI AB? I don't know if it's new BIOSes or the drivers I updated, but I had noted that I could run Port Royal at 2085 MHz and it would stay there from the beginning to the end of the benchmark, maybe dropping briefly to 2070 and going back to 2085 MHz. Now if I set the same 2085 MHz I get around 2040-2055 MHz, and sometimes even 2025 MHz. What changed to cause that?


----------



## Imprezzion

What's the perf cap reason in GPU-Z?
And did you compare effective clock before and after? It might just have been over-reporting previously. Effective clock is quite different from what MSI AB reads out.


----------



## ViTosS

Imprezzion said:


> What's the perf limit reason in GPU-Z?
> And did you compare effective clock before and after? It might just have been over reporting previously. Effective clock is quite different then what MSI AB reads out.


Hmm, I don't know what you mean, but my PerfCap Reason shows as IDLE. I don't know how to check effective clock, but judging by the MSI overlay it's not like before: it's not showing the clock I set, it's sometimes around 30 MHz lower, and it wasn't like that before.


----------



## Imprezzion

Effective clock is the actual clock speed the core runs at, not just what it reports. HWiNFO64 can monitor this, for example.

If I run 1980 core, effective is around 1930-1940, for example. The difference between reported and effective gets bigger if you hit the power limit.
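The difference is easy to picture: the effective clock is a time average that includes the brief moments the core is clock-gated or dipping, while the reported clock is the instantaneous target. A toy illustration (the samples below are made up):

```python
# Toy illustration of reported vs. effective clock: the reported clock is
# the instantaneous frequency the core is set to, while the effective clock
# averages over time -- including brief periods where the core is
# clock-gated or throttled. Samples here are invented for illustration.

def effective_clock(samples):
    """Mean of per-millisecond clock samples (MHz), gated periods included."""
    return sum(samples) / len(samples)

# 1980 MHz requested, but a power-limit dip and some gating mixed in:
samples = [1980] * 90 + [1860] * 6 + [0] * 4   # 100 one-ms samples
print(f"reported: 1980 MHz, effective: {effective_clock(samples):.0f} MHz")
```

Even a few percent of gated time pulls the average well below the reported figure, which is why the two readouts diverge most when the card is bouncing off its power limit.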

How can it give IDLE as perfcap when you're running a benchmark lol..


----------



## ViTosS

Imprezzion said:


> Effective clock is the actual clock speed the core runs, not what it reports. HWINFO64 can monitor this for example.
> 
> If I run 1980 core effective is around 1930-1940 for example. The difference between reported and effective gets bigger if you hit power limit.
> 
> How can it give IDLE as perfcap when you're running a benchmark lol..


Actually, I checked while playing a game using 40% of the GPU. How do I check it during a benchmark or game without having to alt-tab?


----------



## Imprezzion

ViTosS said:


> Actually I checked while playing a game using 40% of GPU, how do I check it in the benchmark or game without having to alt+tab to check it?


Second monitor or enable logging to a txt file in HWINFO64 or GPU-Z ( Collecting GPU logs using GPU-Z | NVIDIA.) I guess?


----------



## ViTosS

Imprezzion said:


> Second monitor or enable logging to a txt file in HWINFO64 or GPU-Z ( Collecting GPU logs using GPU-Z | NVIDIA.) I guess?


It showed PerfCap Reason "16" throughout the log until the end of the benchmark; I don't know what that means.


----------



## Garrett1974NL

MikeS3000 said:


> I just finished the job. First off the 80x40 2.0mm pad is not enough to do all of the thermal pads. It was enough to do the memory and had a little left over. The 3.0mm 80x40 pad is large enough to do the backplate. I tried some mining and previous temps peaked to around 106C before the replacement, now I am maxing out at 76C! I’m pretty happy with the results but wish I could have replaced all of the greasy Gigabyte pads.


I was away for a few days, so apologies for the late response.
I see you already replaced the pads and you gained, or should I say LOST lol, 30 degrees. Told you, eh?
You don't happen to have any photos of it?
I would definitely take photos if I were to do it myself.


----------



## Panchovix

Imprezzion said:


> How can it give IDLE as perfcap when you're running a benchmark lol..


It can happen if you undervolt so that you never hit the TDP limit.

You won't be Pwr limited, since you won't reach the max TDP.
You won't be VRel/VOp limited, since you're undervolting.
And you probably won't be thermally limited either, if you're undervolting.
That leaves only one PerfCap reason, Idle ("no load"), and it shows that only because there is no other limit to report.

I can confirm this happened with my 3060 Ti and happens with my current 3080, both undervolted, playing any taxing game at 99% GPU usage.
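That elimination logic can be sketched as a function. The labels and thresholds below are illustrative only, not the actual NVAPI flag values that GPU-Z decodes:

```python
# Sketch of the elimination logic above: if an undervolted card never hits
# its power, voltage, or thermal limits, the only reason left to report is
# "Idle"/"no load". Thresholds and labels here are illustrative, not the
# actual NVAPI perf-cap flag values GPU-Z decodes.

def perfcap_reason(power_w, power_limit_w, mv, mv_max, temp_c, temp_limit_c):
    if power_w >= power_limit_w:
        return "Pwr"          # hit the board power limit
    if mv >= mv_max:
        return "VRel/VOp"     # hit the voltage ceiling
    if temp_c >= temp_limit_c:
        return "Thrm"         # hit the thermal limit
    return "Idle"             # nothing is capping the card

# Undervolted card under full load: 0.9 V at ~290 W, under every limit.
print(perfcap_reason(290, 350, 900, 1081, 65, 83))
```

With all three real limits out of reach, the fallback reason is all the sensor has left to show, even at 99% GPU usage.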


----------



## Imprezzion

I didn't think that far haha. My poor 2x8pin card is literally always PWR as even undervolted at 0.987v it throttles with +90 core clock. I just leave it at stock volts in MSI AB with +90 core +1400 memory and the power throttling can take care of what actual clocks and voltages it wants to run. Usually around 1935-1950 @ 0.962-0.975v.


----------



## Panchovix

Imprezzion said:


> I didn't think that far haha. My poor 2x8pin card is literally always PWR as even undervolted at 0.987v it throttles with +90 core clock. I just leave it at stock volts in MSI AB with +90 core +1400 memory and the power throttling can take care of what actual clocks and voltages it wants to run. Usually around 1935-1950 @ 0.962-0.975v.


Mine is a TUF, so it's 2x8-pin as well haha, but I mostly just use 1905 MHz at 0.9 V. For an "overclock" I do 2055 MHz at 0.975 V, and that's about it; any more voltage and I get power limited lol


----------



## Astral85

Did anyone experience low performance from their 3080 when they got it? I just got my ROG Strix 3080 up and running today and the performance seems really poor, perhaps worse than my 2080 Ti, and not at all the boost I was expecting. I'm not sure what the issue is; HWiNFO correctly reads the card in PCIe 3.0 x16 mode. The card boosts to around 2025 MHz. I did a clean driver install with DDU.


----------



## Nizzen

Astral85 said:


> Did anyone experience low performance from their 3080's when they got them? I just got my ROG Strix 3080 up and running today and the performance seems really poor, perhaps worse than my 2080 Ti and not at all the performance boost I was expecting. I'm not sure what the issue is, HWiNFO is correctly reading the card in PCIE 3.0 16x mode. The card boosts to around 2025Mhz. I clean DDU the drivers.


In what game and resolution?

In some games at 1080p a 2080 Ti can be faster than a 3080: typically high-fps games.


----------



## Phantomas 007

I need help with a decision. Many months ago I ordered an Asus GeForce RTX 3080 10GB ROG Strix OC, but because of the delay I now have 3 options:

1) Continue to wait

2) Gigabyte Aorus GeForce RTX 3080 Xtreme 10G

3) Asus GeForce RTX 3080 Ti 12GB TUF Gaming OC

The obvious choice is option 3 with 12GB, but is it?


----------



## Imprezzion

It's 97% of a 3090, of course it is..


----------



## SoldierRBT

Astral85 said:


> Did anyone experience low performance from their 3080's when they got them? I just got my ROG Strix 3080 up and running today and the performance seems really poor, perhaps worse than my 2080 Ti and not at all the performance boost I was expecting. I'm not sure what the issue is, HWiNFO is correctly reading the card in PCIE 3.0 16x mode. The card boosts to around 2025Mhz. I clean DDU the drivers.


I tested my 3080 FTW3 (450W) a few days ago and it's around 15-20% slower than my 3090 KPE (520W) in games. Do you think your 3080 is underperforming? Can you run Port Royal or Time Spy and share results?


----------



## Panchovix

Phantomas 007 said:


> I need help about my matter. I had ordered many months now a Asus GeForce RTX 3080 10GB ROG Strix OC. But because of the delay i have 3 options:
> 
> 1) To continue wait
> 
> 2) Gigabyte Aorus GeForce RTX 3080 Xtreme 10G
> 
> 3) Asus GeForce RTX 3080 Ti 12GB TUF Gaming OC
> 
> The obvious choice is the 3 with 12GB but it is ?


3080Ti of course, at 4K it will be 10-15% better than the 3080


----------



## Astral85

Nizzen said:


> In what game and resolution?
> 
> In some games in 1080p, 2080ti can be faster than 3080  Typical high fps games.


AC: Valhalla @ 3440x1440 90 FPS limit.


----------



## Astral85

SoldierRBT said:


> I tested my 3080 FTW3 (450W) a few days ago and it's around 15-20% slower than my 3090 KPE (520W) in games. Do you think your 3080 is underperforming? Can you run Port Royal or Time Spy and share results?


Yes I will run some 3D Mark tests later today.


----------



## Astral85

@SoldierRBT 

Time Spy GPU score is definitely up over the 2080 Ti. I tried AC: Valhalla again today and got much better results than last night. Not sure what happened last night, but maybe it was the GeForce Experience optimized settings I applied for Valhalla, or DOF at high, or something. It was the first run of the new card after working on the custom loop all day, so I could have overlooked something.

I'm seeing the 3080 really starting to perform now that I've opened up the power limit and voltage. I've got the core running at 2130 MHz with +80 on the voltage slider. The GPU core runs at 37-38C under the EK Strix waterblock while playing. I haven't touched memory yet. Happy with these results so far...

I don't know what it is with AC:V, but its anti-aliasing is extremely demanding. I was surprised to see the FPS take a hit with the 3080 on medium AA; I think the AA in this game is just overly demanding. Furthermore, this is an AMD-sponsored game and performs better on AMD GPUs. With similar game settings to what I had been using on the 2080 Ti, I can see the 3080 is definitely performing stronger...


----------



## Phantomas 007

Imprezzion said:


> It's 97% of a 3090 of course it is..





Panchovix said:


> 3080Ti of course, at 4K it will be 10-15% better than the 3080


OK!

Is the Gigabyte Aorus GeForce RTX 3080 Xtreme 10G a good choice for a 3080? I see its cooling system is even bigger than the 3090 version's.


----------



## SoldierRBT

@Astral85 

Your score is perfectly fine for a 3080. AC games aren't well optimized; every time I play one, the wattage use is low compared to other games. You should test Battlefield V, Metro Exodus or Quake II RTX and then compare results to your 2080 Ti.

Max out the voltage and power sliders and start with +500 memory and see how it performs. There's a point where memory OC decreases performance if it's too high.


----------



## Imprezzion

I really wonder if I should remount the waterblock of the EVGA Hybrid again. Reported temps are fine, but hotspot has way too much deviation for my liking.
The highest values after an evening of gaming are 60.8C reported core and 77.3C hotspot (VRAM junction 76C). That's a 16.5C delta.


----------



## acoustic

That poor hybrid cooler is just crying to be left alone LOL


----------



## Imprezzion

acoustic said:


> That poor hybrid cooler is just crying to be left alone LOL


Yeah I remounted 4 times already in the past as hotspot was 25-30c off at first. It was a VRAM pad that folded on itself and lifted a corner of the block. But even now it's not great lol. 

Liquid metal time maybe? Hehe


----------



## acoustic

My hotspot was not good either with the 3080FTW3 and hybrid. It wasn't terrible, but with the EK block I get max 8c differential between GPU Temp and Hotspot. I don't know why you don't just do a soft-tube loop with a full block, and stop messing with that child's play hybrid cooler lol


----------



## Imprezzion

Mostly money. I'd need a new pump, at least 2 new rads, a new CPU block (OK, I technically could reuse this Supremacy, but...), a full-cover GPU block, fittings, tubing... that's altogether around €600, compared to the ~€120 the Hybrid cost me.


----------



## FedericoUY

Hello. Is anyone undervolting their 3080? I'm having a weird situation where I set my curve at [email protected] and it jumps to 2055. If I set a lower voltage, for example at 2010, it jumps to 2025. Whenever thermals are OK it goes to the upper strap (and the curve still shows the dot where I set it). Never seen this before. Could it be an Afterburner thing?


----------



## blurp

FedericoUY said:


> Hello. Is anyone undervolting their 3080's? I'm having a weird situation, where I set my curve at [email protected] and it jumps to 2055. If I set at less voltage, for example at 2010 it jumps to 2025. All the time when thermals are going ok, it goes to the upper strap (and curve still shows the dot where I set it). Never seen this. Should be a afterburner thing?


Make sure you set your curve after the GPU has been idle for a while.


----------



## fray_bentos

FedericoUY said:


> Hello. Is anyone undervolting their 3080's? I'm having a weird situation, where I set my curve at [email protected] and it jumps to 2055. If I set at less voltage, for example at 2010 it jumps to 2025. All the time when thermals are going ok, it goes to the upper strap (and curve still shows the dot where I set it). Never seen this. Should be a afterburner thing?


If the temps are low enough a 3080 will boost one (15 MHz) or two (30 MHz) frequency bins above what is specified on the voltage curve. Hence, if your temps are good, you'll always GET 30 MHz above the frequency that you SET (if you set your curve at idle). For pinning down a stable overclock, I have found it most reliable to note down the voltage / SET frequency and adjust frequency up or down from that base, but only with the GPU at idle temperatures.
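That set-vs-get behaviour can be sketched as a function. Ampere moves in 15 MHz bins; the temperature cutoffs below are illustrative guesses, since the exact thresholds aren't public:

```python
# Sketch of the boost behaviour described above: clocks move in 15 MHz
# bins, and a cool card runs one or two bins above the point you SET on
# the curve. The temperature cutoffs are illustrative guesses only --
# the actual thresholds aren't public.

BIN = 15  # MHz per frequency bin on Ampere

def expected_get_clock(set_mhz, temp_c):
    """Rough SET -> GET clock, given core temperature."""
    if temp_c < 45:        # cool: two bins of extra boost (assumed cutoff)
        return set_mhz + 2 * BIN
    if temp_c < 60:        # warm: one bin (assumed cutoff)
        return set_mhz + BIN
    return set_mhz         # hot: you get what you set (or less)

print(expected_get_clock(2025, 40))  # 2055
print(expected_get_clock(2025, 55))  # 2040
```

This is why setting 2025 on a cold card shows 2055 in the overlay: the card added two 15 MHz bins on its own.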


----------



## FedericoUY

I'm setting everything at idle with Windows freshly started, but it still jumps to the next strap. I have a pretty aggressive fan curve, so when it reaches 60C it's at 100%. What speeds are you reaching on VRAM? I went all the way up to 11k (+1500) and it's still stable. I'm impressed with the OC capabilities of this card!


----------



## Imprezzion

Did you test with an FPS monitor on, standing still in a heavy game? The memory has error correction, so it won't crash, but it will reduce performance.

The way I tested it was Division 2, standing still at the spawn point outside the Base of Operations and watching FPS. Raise the offset by +100: if FPS is the same or higher, it's fine to test further; if FPS drops, it's too high and the memory is correcting.

+1500 is definitely possible, but not very common. Mine does +1200 fully stable (at 80C VRAM junction), and it can do +1400 but with random drops.
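That step-and-check procedure can be written down as a little search. The fps_at() function here is a hypothetical stand-in for your own measurement; it fakes a card whose scaling dies past +1200:

```python
# The step-and-check procedure above as a sketch: raise the memory offset
# in +100 steps and watch FPS at a repeatable spot; once FPS drops (error
# correction replaying transfers), back off. fps_at() is a HYPOTHETICAL
# stand-in for a real measurement -- it models a card whose scaling dies
# past +1200.

def fps_at(offset_mhz):
    """Fake FPS response: gains up to +1200, losses beyond."""
    base = 100.0
    gain = min(offset_mhz, 1200) * 0.004
    penalty = max(0, offset_mhz - 1200) * 0.01
    return base + gain - penalty

def find_best_offset(step=100, limit=2000):
    best_offset, best_fps = 0, fps_at(0)
    for offset in range(step, limit + step, step):
        fps = fps_at(offset)
        if fps < best_fps:   # performance dropped: correction kicked in
            break
        best_offset, best_fps = offset, fps
    return best_offset

print(f"best stable offset: +{find_best_offset()} MHz")
```

The key point of the procedure: since error correction hides instability instead of crashing, FPS at a fixed, repeatable scene is the only honest signal.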

Btw guys, at which Hotspot temperature should I be getting worried? I know my mount ain't perfect but it's a LOT of work to pull the block off again.

At 27c ambient I'm at 62-63c core and 78-80c hotspot with reasonable fan speeds. Is that still good enough? I know the temp delta is kinda big but as I said, remounting is a LOT of work.

Those temps are with +90 core +1200 memory slamming straight into the power limit at 350-355w measured over 2+ hours of gaming. Usual clocks around 1950-1980 @ 0.962-0.987v due to throttling. It's set to 2100 @ 1.100v.


----------



## fray_bentos

FedericoUY said:


> I'm setting all at idle with windows just started, but still jumps to the next strap. I have a pretty aggressive fan curve, so when it reaches 60 it is at 100%. What speed are you reaching in vram? I went all the way up to 11k (+1500) and still stable. I'm impressed with OC capabilities on this card!


Yes, the frequency will always jump up one or two straps irrespective of load; it depends on temperature. The lower the temperature, the more likely the jump.

As you increased the memory clock, did you check your benchmark scores at the same time? 3080 VRAM has error correction, so you can pump clocks high, but the correction kicks in, meaning your performance benefit is either greatly limited or actually negative if you go too high.

I tried memory OC early on, but soon gave up when I saw that performance scaling was almost non-existent. Performance did not scale linearly with the % frequency change (i.e., a sign that error correction is kicking in). The memory on my FE seemed basically pushed as far as it can go already. Memory OC also eats into your overall GPU power budget, generates more heat, and runs the risk of early failure (remember the 2080 Ti memory issues?). I just don't think memory OC is worth it on a 3080.

Edit: just saw @Imprezzion beat me to it. Our posts say similar things, snap!


----------



## FedericoUY

Great, thank you both. I'm new to this architecture and really didn't know the VRAM has ECC. So, what are you using to quickly test the memory? Port Royal? Something faster? Do you know you've reached the best config when the scores stop improving? Is there any way to check whether error correction has been triggered? Do you never get an artifact or a plain error when overclocking (or going beyond the limits) on the VRAM?
Thanks for your answers!


----------



## Imprezzion

I first do the quick-and-dirty test with a game, standing still, and once I've found a spot in the clocks where FPS still scales and doesn't drop (same with frame times; monitor those as well), I run TS Extreme a few times with different memory clocks to see if the scores still scale.

This is how I got to my +1200. +1300 has the same scores, and +1400 drops slightly.

On the core I can do way more, but the 2x8-pin power limit holds it back, so yeah. I can't effectively get over ~1980 MHz; it won't do 2000+ under 0.987v, unfortunately.


----------



## acoustic

FedericoUY said:


> Great, thank you both, new on this arch and really didn't know that vram has ECC, so well, what are you using to fast test the memory? Port royale? Something faster? So you know you reached the best config when the scores are not bumping anymore? Do you have and way to check if error corrections has been triggered? You never get an artifact or a plain error when over overclocking (or going beyond the limits) on the vram?
> Thanks for your answers!


I use Heaven Benchmark in windowed mode. I freeze the benchmark (it's still actively rendering) and adjust my memory clock until the framerate either A) stops increasing or B) begins to decrease.

I usually then knock 100-200 MHz off that number and leave it alone. Quite honestly, GDDR6X memory is so fast that I don't think it's much of a bottleneck anyway, even at 4K. I would rather have the power limit freed up to let the core go higher.


----------



## Panchovix

FedericoUY said:


> Great, thank you both, new on this arch and really didn't know that vram has ECC, so well, what are you using to fast test the memory? Port royale? Something faster? So you know you reached the best config when the scores are not bumping anymore? Do you have and way to check if error corrections has been triggered? You never get an artifact or a plain error when over overclocking (or going beyond the limits) on the vram?
> Thanks for your answers!


In my case I used Unigine Superposition at 1080p Extreme in the "game" mode (or something like that), where you can leave the camera in a still position and see any change. I tested on Control with RTX and no DLSS as well.

I started with +100 MHz steps until +1000 MHz, and then +50 MHz steps until it wasn't stable or performance decreased on my 3080 TUF.

For example, in my case I gain performance up to +1600 MHz (+1650 MHz to be exact); at +1700 MHz I start to see performance decrease, and at +1750 MHz I get some crashes (yes, a crash with GDDR6X). +1800 MHz is basically an instant crash for me.

I do +1500-1600 MHz mostly on benchmarks, though; I usually use +1000 MHz with an undervolt (either 1920 MHz at 0.9V or 2055 MHz at 0.975V; can't go higher because of the 2x8 pin).


----------



## FedericoUY

Great, thanks for the answers. @Panchovix I also tend to do quick 1080p Extreme tests with Superposition to check for stability; I find 3DMark too long. I'll start testing the VRAM tonight from +0 and up. I also undervolt my card and the chip behaves great; currently testing with BF V, CoD MW and Control. 925mV @ 2025MHz (which generally spikes to 2040). Very happy with it, even after testing a 3090 FTW3.
Is it possible to check error correction on the GDDR6X via HWiNFO?


----------



## Panchovix

FedericoUY said:


> Is it possible to check error corrections of the gddr6x via hwinfo?


Not that I know of, sadly.



FedericoUY said:


> 925mv @ 2025mhz (which generally spikes to 2040)


That's quite a silicon lottery winner lol


----------



## Imprezzion

How are you all getting 2000+ clocks under 1v? I need a minimum of 1.043v to even stand a chance above 2025 MHz, and the best I've been able to run in games that don't power limit, like BF4 and World of Tanks capped at my 280 Hz refresh rate + G-Sync, is around 2055 @ 1.087v to 2070 @ 1.100v; anything higher instantly crashes with a DirectX error lol. Stuff like Cyberpunk 2077 with Performance DLSS (which also doesn't throttle, thanks to DLSS reducing the load enough) can't even run that, and at best 2040-2055 @ 1.087-1.100v is stable. Any more MHz or less voltage results in freezes or DirectX crashes.

What I do find weird is that my card seems to have a voltage/frequency "wall", if you know what I mean.

I can do 1965 (sometimes boosting to 1980) @ 0.987v just fine, and it's been stable for months. Never had a crash in 200+ hours of the heaviest games like CP2077, Metro Enhanced, BFV, Ghostrunner, Division 2. But anything higher, like 2010 MHz, isn't stable under 1.043v at all.


----------



## Panchovix

Imprezzion said:


> How are you all getting 2000+ clocks under 1v like.. I need a minimum of 1.043v to even stand a chance above 2025Mhz and the best I've been able to run in games that don't power limit like BF4 and World of Tanks limited to 280FPS monitor refresh rate + GSync is like 2055 @ 1.087 to 2070 @ 1.100v.. anything higher instantly crashes with a DirectX error lol.. Stuff like Cyberpunk 2077 with Performance DLSS (also does not throttle thanks to DLSS reducing load enough not to) can't even run that and at best 2040-2055 @ 1.087-1.100v is stable. Any more MHz or less voltage results in freezes or DirectX crashes.
> 
> What I do find weird it that my card seems to have a voltage / frequency "wall" if you know what I mean.
> 
> I can do 1965 (sometimes boosts to 1980) @ 0.987v just fine and it's been stable for months. Never had a crash in 200+ hours of the heaviest games like CP, Metro Enhanced, BFV, Ghostrunner, Division 2. But anything higher like 2010Mhz isn't stable under 1.043v at all.


I guess it's silicon lottery; on my 3080 TUF basically any modern game will power limit the card.

This is the best I could get on Time Spy, for example: I scored 17 637 (19 383 graphics score, 2055 MHz at 0.975V), and I still get power limited in the 2nd test. It goes to 2070 MHz if the card is cold, but that's unstable in some games (especially RTX ones without DLSS).

At 2040 MHz/0.975V it is 100% stable (for example Control with RTX and no DLSS, CP2077, Metro Exodus Enhanced Edition, etc.).

I remember doing a test months ago on BF4 where I could max the card's voltage without being power limited, and it did reach 2175 MHz, so the chip in this TUF is probably decent.

But anyway, since the 3080 is already plenty powerful, I just leave it at 1920 MHz/0.9V with a +1000 MHz mem OC.


----------



## Imprezzion

Panchovix said:


> I guess it is silicon lottery, on my 3080 TUF basically any modern game will power limit the card.
> 
> 
> This is the best I could get on TimeSpy for example I scored 17 637 in Time Spy, and I still get power limited in the 2nd test. (19383 Graphics score, 2055Mhz at 0.975V), it goes to 2070Mhz if the card is cold, but it's unstable on some games (specially RTX ones without using DLSS).
> 
> At 2040Mhz/0.975V, It is 100% stable (for example control with RTX without DLSS, CP2077, ME: Enhanced Edition, etc)
> 
> I remember doing a test months ago on BF4 where I could max the card voltage without being power limited, and it did reach 2175Mhz, so probably the chip in this TUF card is decent
> View attachment 2517416
> 
> 
> But anyways, since the 3080 is too powerful already, I just leave it at 1920Mhz/0.9V and +1000Mhz mem OC
> View attachment 2517417


This is all it can do; it won't get any better. 2100 is unstable in BF4 even at the full 1.100v. With this curve it will at least run 2070 MHz cold (effectively 2040 after temperature) at a voltage where it's stable enough, but in most games I play it sits around the 1980 @ 0.981v mark, which is the best it'll do without throttling.

Control with RTX and no DLSS was a disaster as well. It can't even hold 1935 in that game; I've seen it drop into the 18xx range several times lol.


----------



## Panchovix

Imprezzion said:


> Control with RTX and no DLSS was a disaster as well. It can't even get 1935 in that game. I've seen it drop into the 18xx several times lol.


Yep, RTX games without DLSS tend to kill most undervolts/overclocks, so that's why I use them for testing haha.

A bummer though; your chip doesn't seem bad by any means, probably average, but the power limit is holding the card back a lot.


----------



## SoldierRBT

People that went from 3080 stock air cooler to waterblock. How much improvement in VRAM OC did you get?


----------



## mouacyk

lotz of mhz


----------



## fray_bentos

Imprezzion said:


> View attachment 2517424
> 
> 
> This is all it can do. Won't get any better. 2100 is unstable on BF4 even at the full 1.100v. This way with this curve it will at least run 2070Mhz cold (effective 2040 after temperature) at a voltage it is stable enough but in most games I play it sits around the 1980 @ 0.981v mark which is the best it'll do without throttling.
> 
> Control with RTX and no DLSS was a disaster as well. It can't even get 1935 in that game. I've seen it drop into the 18xx several times lol.


Indeed, this is why I recommend just setting a capped frequency of around 1830 MHz at a voltage of 800-850 mV (depending on silicon quality), which keeps temps low and allows the +30 MHz (two-bin) boost to kick in (giving 1860 MHz). The result is essentially the same performance in RTX loads, but with less power, heat, and noise than forcing an additional ~100 mV of load voltage and still ending up at almost the same frequency.

Games that place less load on the GPU do give higher frequencies, but this rarely translates into an improved gaming experience; such games are typically already outputting at >120 FPS, which is the point where many games begin to encounter CPU/RAM bottlenecking, or hit monitor maximum refresh rates (I have a 165 Hz monitor).

Ultimately, we are all limited by the same fundamental issues. Increasing the available power, even when paired with increased cooling capacity, doesn't yield worthwhile gains versus the downsides: disproportionate cost (high power-limit card/water cooling), noise, and heat. These same fundamental limitations also explain why a 3090 has 21% more CUDA cores than a 3080, yet only attains 0-15% more performance.
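The power argument can be put in rough numbers with the first-order CMOS dynamic-power relation, P ∝ V²·f. This is only an illustrative estimate (real cards also burn static and memory power, and the reference figures below are just typical values from this thread, not measurements):

```python
# Back-of-envelope dynamic power scaling, P ~ V^2 * f, to illustrate why a
# ~1860 MHz @ 0.85 V cap can draw far less than ~1950 MHz @ 0.987 V.
# First-order estimate only; static/memory power is ignored.

def relative_power(freq_mhz, volts, ref_freq=1950, ref_volts=0.987):
    """Dynamic power relative to a reference clock/voltage operating point."""
    return (volts / ref_volts) ** 2 * (freq_mhz / ref_freq)

r = relative_power(1860, 0.850)
print(f"undervolted config draws ~{r:.0%} of reference dynamic power")
```

Under these assumptions the capped configuration lands around 70% of the reference dynamic power for roughly 95% of the clock, which is the trade-off being argued for.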


----------



## acoustic

That 0-15% stretches at higher resolutions, though, no? I've seen examples of the 3090 (or even the 3080TI now) being 20%+ ahead, especially the higher power-limit cards, but really only at high resolutions.

I wish they'd give us a 475-500 watt BIOS for the 3080TI, but oh well. Lol


----------



## fray_bentos

acoustic said:


> That 0-15% stretches at higher resolutions, though, no? I know I've seen examples of the 3090 (or even the 3080TI, now) where it's 20%+, especially on the higher power limit cards, but it's really only at high resolutions.
> 
> I wish they'd give us a 475-500watt BIOS for the 3080TI, but oh well. Lol


Yes, that's right, but some of that is due to increased memory bandwidth on the 3090, which is obviously more important at higher resolutions.


----------



## acoustic

Yeah. The memory bandwidth of the 3080 is no slouch either, though. I don't think the cores scale very well below 4K; just seems like the 3080TI/3090 need that higher resolution to stretch the legs a bit. Not entirely sure how much of that is due to memory bandwidth. I suppose you could downclock the memory on the TI/3090 to see. Might be an interesting comparison.


----------



## Astral85

I started playing Ghostrunner with my new Strix 3080 OC, everything maxed, RTX on, DLSS Performance. The first couple of levels ran really smoothly at around 90-100 FPS and looked great. Now I'm at level 3, "The Climb", and it's a real stutter fest. My searches seem to indicate RTX is an issue on later levels? Does anyone have experience with this game with RTX on?


----------



## fray_bentos

acoustic said:


> Yeah. The memory bandwidth of the 3080 is no slouch either, though. I don't think the cores scale very well below 4K; just seems like the 3080TI/3090 need that higher resolution to stretch the legs a bit. Not entirely sure how much of that is due to memory bandwidth. I suppose you could downclock the memory on the TI/3090 to see. Might be an interesting comparison.


Indeed, I run 1440p, and I've become quite a fan of DSR to pump up the rendering resolution and use the scaling headroom to increase image quality. It works quite well on older games and really cleans up image quality in games with poor AA implementations (it's essentially brute-force AA). It's especially nice in racing games, where you often look far into the distance and there are lots of hard edges from track lines, or in games with lots of spiky grass. However, I target 100-120 fps and newer games don't let me get there even at 1440p, though I am often close to maxing out settings. Given that I favour high framerates, I still don't think we are quite at the level of GPU performance where a high-refresh 4K screen can be fully exploited in all games.


----------



## Mystic33

I'm sharing my 3080 EVGA FTW3 Ultra's best score in the Superposition benchmark, with the 450W BIOS on air @ 2250 MHz.


----------



## Imprezzion

I run 1080p 280Hz (HDR on sometimes) on an ASUS VG279QM.

I ran a few tests using CP2077 with RTX Psycho, everything maxed at 1080p, no DLSS. I run the EVGA XC3 non-ReBAR BIOS, as I use an EVGA Hybrid cooler and this BIOS lets me adjust fan and pump way better than Gigabyte's. There's no difference in clocks, FPS, benchmark scores or stability between the two.

Weird stuff happens. If I just max the power limit and voltage sliders, use a +90 offset, and let the card/BIOS do the throttling, it ends up around 1935 MHz ish at 0.950-0.962v.

If I manually make a curve, I can let it run 1.000-1.018v at the same clocks and it somehow doesn't hit the power limit. It also lets it go higher. If the card does the work, it hits about 345w before it throttles. A manual curve lets it hit 355-360w and it still does not throttle. Max power in HWiNFO64 was 364.5w and still no throttling, even though this BIOS maxes at 366w.

I am now just sitting AFK in CP2077 testing some clocks, currently a manual curve at 1965 @ 0.950v. It runs around 330-335w now and temps are noticeably lower on both core and hotspot, by about 4-5c. I'm going to the shops for some groceries and I'll see if it still runs and hasn't crashed when I come back lol.

Effective clocks are also higher on a manual curve than they are on an offset..

EDIT: Still runs. It boosts to 1995 @ 0.962v most of the time, sometimes dropping to 1980 @ 0.950v for a second, but it doesn't throttle as badly anymore and is much cooler now. Before, on offset, it ran 62-63c core, 79-80c hotspot. (I know, I know, the delta.. mount ain't perfect..)

Effective clocks sit around 1950-1956 MHz, which is much higher than it was with the offset. That was more like 1910-1925, even with the same reported clocks.


EDIT2: Even lower: 1935 @ 0.918v. It barely goes over 300w now. This is a bit too low and costs performance at this point, but it's nice that it CAN do it.


----------



## Clukos

SoldierRBT said:


> People that went from 3080 stock air cooler to waterblock. How much improvement in VRAM OC did you get?


From +1000 24/7 stable to +1800 24/7 stable.

3080 FE stock to 3080 FE + EK Vector FE block + thermalright odyssey pads. Memory temp dropped from 110C max to 54C max.


----------



## Astral85

Here is my Time Spy score with my new ROG Strix 3080 OC on an EK Vector.


----------



## Imprezzion

Impressive. My best (graphics) score is 17688 because of power throttling on 2x8 pin. I just cannot squeeze any more out of it. My BIOS (EVGA XC3) allows 366w, and it bounced off 363w the whole time at around 2025 MHz @ 1.018v.

I did find a new sweet spot thanks to you guys' advice of using a curve to limit the voltage.

It's been stable for two evenings of gaming at 1950 MHz @ 0.925v set, and it no longer throttles at all. It usually even boosts to 1965 MHz after 30-60 seconds, and then while gaming it switches between 1950-1965 depending on temperature. Power draw sits around 325-330w in the heaviest games, right at 98-101%, which is fine as it has 109% to run with.


----------



## Panchovix

Imprezzion said:


> My best (graphics) score is 17688 cause of power throttling on 2x8 pin. I just cannot squeeze any more out of it.


I think you can actually squeeze more of it, or I hope so at least.

For example, this is my score with a TUF 3080 (2x8 pin as well)

Here is the link for the run: I scored 17 637 in Time Spy
And this is my curve for this bench (note that it doesn't even reach 0.987V in Time Spy (it does in games); it just runs 2055 MHz at 0.975V. The curve is 2085 MHz at 0.987V, 2070 MHz at 0.981V and 2055 MHz at 0.975V.)

With this mem OC (and fans at 100% speed, for benchmarks only)

If you try something similar, maybe you can reach above 18K?


----------



## Nizzen

Mystic33 said:


> I share my 3080 evga ftw3 ultra best score in superposion benchmark with the 450W bios on air @2250mhz


Please post Port Royal.


----------



## Astral85

Why are some AIB cards offering 3x8-pin PCBs? I was pulling up to 425W in Time Spy. 😮


----------



## Imprezzion

Panchovix said:


> I think you can actually squeeze more of it, or I hope so at least.
> 
> For example, this is my score with a TUF 3080 (2x8 pin as well)
> View attachment 2517709
> 
> Here is the link for the run: I scored 17 637 in Time Spy
> And this is my curve for this bench (take note that it doesn't even reach 0.987V on TimeSpy (it does in games), it just does 2055Mhz at 0.975V; it's 2085Mhz at 0.987, 2070Mhz at 0.981 and 2055Mhz at 0.975V)
> View attachment 2517710
> 
> With this mem OC (and fans at 100% speed, for benchmarks only)
> View attachment 2517711
> 
> If you try something similar, you may reach above 18K?


Here.

CPU @ x51 all core with x47 cache and 4400C17 RAM.
CPU reached 68c hottest core.
Card at 1950 @ 0.925v mostly boosting to 1965. In GT2 it did drop a few times to 1935 @ 0.918v as it touched the power limit still.
Card reached 55c core, 72.5c hotspot and 74c VRAM Junction.
+1200 memory.
Latest drivers (471.xx), stock settings (G-Sync enabled).


Going to run again at +1400 memory. I know the card can do it, but I put it back to +1200 after some weird unexplained NVIDIA driver crashes in some games. They had nothing to do with memory, but I never put it back lol.

EDIT: +1400. Technically it IS faster. Barely.


Curve:


Effective clocks sit at 1936-1941 MHz when it runs 1965, and 1914-1918 MHz when it runs 1935, due to temperature / DLSS reducing the load enough for it not to boost to 1965.


----------



## Panchovix

Pretty nice, I knew you could reach above 18k on a 2x8 pin.

I don't know my temps on my 3080 TUF beyond what's in the score, which shows 44°C; probably 59°C on the hotspot and 70°C on VRAM if I had to guess, but again, with fans at 100%.




Astral85 said:


> Why are some AIB cards offering 3x8 pin PCB's?


So you don't get power limited like us with humble 2x8-pin cards lol


----------



## FedericoUY

Hi all. I've been testing the VRAM on this card. Up to +1500 the RAM keeps bumping the scores.

Starting with what I'm using now: CPU all-core 5 GHz (10850K), 16 GB 3600 RAM, GPU at [email protected] with +500 VRAM (10002 MHz). The difference is that this time it's not bumping up, but moving between 2025 and 2040, even when I set 100% fan speed and temps are lower. I really don't get when it bumps up, because it's not a temp-related thing.
This is today's Time Spy test:


Testing VRAM (posting only Superposition Extreme, but I did other tests as well) with the GPU at [email protected]. I tried these configs; when I reached +1500 (11002 MHz) the difference was so small that I think that's about it (screenshots attached):

+0 (9502 MHz)
+500 (10002 MHz)
+750 (10252 MHz)
+1000 (10502 MHz)
+1250 (10752 MHz)
+1500 (11002 MHz)
So I guess my best VRAM config for benchmarks could be 11k (I did not try more); even knowing that, my 24/7 gaming config is +500.
I'll start testing clock speeds later.
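The arithmetic behind those labels can be sketched quickly, assuming (as the +500 → 10002 MHz labels imply) that the Afterburner offset adds directly to the 3080's reported 9502 MHz base memory clock, and that GDDR6X moves two bits per reported clock:

```python
# Hedged sketch of the offset arithmetic above: Afterburner's memory offset
# adds to the reported base clock, and GDDR6X's effective per-pin data rate
# is twice the reported clock (~19 Gbps at stock on a 3080).

BASE_MHZ = 9502  # reported stock memory clock on a 3080

def effective_gbps(offset_mhz: int) -> float:
    """Per-pin data rate in Gbps for a given Afterburner memory offset."""
    return (BASE_MHZ + offset_mhz) * 2 / 1000

print(effective_gbps(0))     # ~19.0 Gbps at stock
print(effective_gbps(1500))  # ~22.0 Gbps at +1500 ("11k" reported)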


----------



## fray_bentos

FedericoUY said:


> Hi all. Been testing the vram on this card. Up to 1500 the ram keeps bumping the scores.
> 
> Starting with what im using now cpu all core 5ghz (10850k), 16gb 3600ram, gpu at [email protected] - +500vram (10002mhz), the difference is that this time is not bumping up, but going between 2025 and 2040, even when I set to 100% fan speed, and temp are lower. I really don't get when it bumps up, because is not a temp related thing.
> This is my today time spy test:
> 
> View attachment 2517792
> 
> 
> Testing vram (posting only superposition extreme, but did with other test aswell) with gpu [email protected], tried this configs, when reached +1500 (11k mhz) the difference was so little that I think that's about it:
> +0(9502mhz):
> View attachment 2517794
> 
> +500(10002mhz):
> View attachment 2517795
> 
> +750(10252mhz):
> View attachment 2517796
> 
> +1000(10502mhz:
> View attachment 2517797
> 
> +1250(10752mhz):
> View attachment 2517798
> 
> +1500(11002mhz):
> View attachment 2517799
> 
> 
> So I guess my best vram config for tests could be 11k (did not tried more), even knowing that, my 24/7 for gaming config is +500.
> Will start testing clock speeds later.


Yes, it is temp related: the max boost bonus (above what is set on the V/F curve) is reduced by 15 MHz for every 5 C temperature increase over 40 C. On top of that, you'll also see the clock drop if you hit your power limit (320 W stock) and/or temperature limit (83 C stock). Your results suggest that you might be at, or close to, your power limit (I'm not sure what your power limit is; it depends on card/BIOS). Increasing RAM speed also increases power usage and eats into your power budget while barely affecting scores. That power, if you wish to use it, would be better spent on GPU clock, if you are OK with the increased noise and heat. I saw the same in my brief look at memory OC on the 3080 and hence don't bother with it; you can also introduce rare, hard-to-diagnose instability. I note @Imprezzion found similar above; overall not worth it.
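That temperature rule can be written out directly. The numbers here are the ones stated in the post (15 MHz per 5 C above 40 C, up to a two-bin bonus), not figures from NVIDIA documentation:

```python
# Sketch of the temperature rule as described in this thread: the boost
# bonus on top of the set V/F curve point loses one 15 MHz bin for every
# 5 C above 40 C. Numbers are from the post, not NVIDIA docs.

def boost_bonus_mhz(temp_c, max_bonus=30, bin_mhz=15, step_c=5, knee_c=40):
    """Boost bonus (MHz) added on top of the set curve point at a given temp."""
    steps = max(0, int((temp_c - knee_c) // step_c))
    return max(0, max_bonus - bin_mhz * steps)

print(boost_bonus_mhz(38))  # 30 -> full two-bin bonus when cool
print(boost_bonus_mhz(47))  # 15 -> one bin lost
print(boost_bonus_mhz(60))  # 0  -> bonus gone at gaming temps
```

This also matches the earlier observation in the thread that a cold card reads 30 MHz above the set curve point, while a warm one reads at or near it.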


----------



## SoldierRBT

Not bad for a 3080 on air: 1.056v locked to 2265 MHz, +1200 mem, 450W ReBAR BIOS. Max temp: 60C.

I scored 13 606 in Port Royal (Intel Core i9-10900K, NVIDIA GeForce RTX 3080, 32768 MB, 64-bit Windows 10; www.3dmark.com)


----------



## parkschance

I've been playing around with a 2x8-pin card for a little while now, and this is what I've been able to reach in Time Spy so far. This is a basic reference board with a bunch of hardware mods: an Alphacool Eisblock Aurora waterblock, better and higher-uF capacitors behind the chip, added caps everywhere there was a place for them around the memory, shunts added to all the existing ones, and I recently landed on the Gigabyte Aorus Xtreme BIOS after testing a bunch that didn't really make much of a difference. When I changed the BIOS I just had to adjust the overclock some.

With this BIOS, here are my settings in Afterburner and my Time Spy score. Any higher overclock on the core and I get crashes, and any higher on the mem results in less performance. I am not using the curve editor for tuning at all. These settings are 100% stable so far in any game and bench test that I've tried.

Comparing my score to others' seems pretty decent, but I feel like there is more if I could completely remove the limits of the board (software and hardware). I also know my CPU is holding me back a little from getting better performance, but I'm currently having some issues with my 10-series CPU and board.

I scored 18 379 in Time Spy (Intel Core i9-9900K, NVIDIA GeForce RTX 3080, 32768 MB, 64-bit Windows 10; www.3dmark.com)


----------



## FedericoUY

SoldierRBT said:


> Not bad for a 3080 on air. 1.056v locked to 2265MHz +1200 Mem 450W ReBar BIOS Max temp: 60C
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 13 606 in Port Royal
> 
> 
> Intel Core i9-10900K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> View attachment 2517819


Those are great clocks and a great score! Can you point me to the BIOS you are using (EVGA, right?)? Is your card watercooled?


----------



## SoldierRBT

FedericoUY said:


> Those are great clocks and score!, can you point me to where is the bios you are using (EVGA right?) ? Is your card watercooled?


Card is on air. It's the 3080 FTW3 Ultra model. Not sure where you can get this BIOS since I updated it through Precision X1. GPU-Z says 94.02.42.80.31


----------



## Imprezzion

It's probably somewhere in the unverified section of the TechPowerUp database lol.


----------



## FedericoUY

SoldierRBT said:


> Card is on air. It's the 3080 FTW3 Ultra model. Not sure where you can get this BIOS since I updated it through Precision X1. GPU-Z says 94.02.42.80.31


OK, mine is the same model; I just did the ReBAR enabling stuff, but my vBIOS looks like it's not the same, since the last characters are different.

Are all of you running these cards with the ReBAR stuff done?
Is anyone on the XOC BIOS from EVGA (https://forums.evga.com/EVGA-GeForce-RTX-3080-FTW3-XOC-BIOS-m3118560.aspx)?


----------



## acoustic

SoldierRBT said:


> Not bad for a 3080 on air. 1.056v locked to 2265MHz +1200 Mem 450W ReBar BIOS Max temp: 60C
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 13 606 in Port Royal
> 
> 
> Intel Core i9-10900K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> View attachment 2517819


That is a golden chip. I have no clue how you're maintaining 1.056v through the entire run of Port Royal with only a 450-watt limit while hitting 60c. Consider yourself extremely, extremely lucky.


----------



## FedericoUY

Kept testing clocks; I think I'm hitting power limits. I set [email protected] and tested with Time Spy, and the power limit keeps toggling on and off. Clocks went from 2205 all the way down to 2025. Temps were always below 60. I think it's time for an uncapped BIOS for me.


Is the BIOS on the EVGA webpage a 450W BIOS?


----------



## Panchovix

This is my 3080 TUF: 2055 MHz at 0.975V, but it drops to 2010 MHz because of the power limit.


I scored 12 704 in Port Royal (AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080, 32768 MB, 64-bit Windows 10; www.3dmark.com)





Wondering how much a shunt mod would help :C but shunting the PCI-E shunt as well (necessary on Ampere) is way too risky imo.


----------



## Imprezzion

Yeah, I thought about it; I even bought the resistors from Mouser but never soldered them on. Current GPU pricing (even though it's coming down, it's still way over MSRP) makes it too much to risk for me right now.

My old 2x8-pin 2080 Ti had no problems with 515w after the mod (220+220+75 PCI-E slot), and the PSU cables handled 220w per cable just fine.

I am pretty cautious with my current card though. If I ever shunt one, it won't be the Gigabyte card, as it uses those weird flat non-standard PCI-E 8-pin connectors and that converter block. I no longer have the block on it, as I bought the ModDIY cables that go from PSU 8-pin to Gigabyte 8-pin directly and they work great, but still, it's a weak, flimsy connector with much smaller pins than a PCI-E 8-pin normally has, so...


----------



## mouacyk

Imprezzion said:


> Yeah I thought about it, I even bought the resistors from Mouser, but never soldered them on. Current GPU pricing, even tho it is going down it's still way way over MSRP, is too much to risk it for me right now.
> 
> My old 2x8 pin 2080 Ti had no problems with 515w after mod (220+220+75 PCI-E slot) and the PSU cables handled 220w per cable just fine.
> 
> I am pretty cautious with my current card tho. If I ever shunt one it won't be the Gigabyte card as it uses those weird flat non-standard PCI-E 8 pin connectors and that converter block. I no longer have the block on it as I have bought the ModDYI cables from PSU 8 pin - Gigabyte 8 pin directly and they work great but still, it's a weak flimsy connector with much smaller pin sizes then a PCI-E 8 pin normally has so...


I'm in the exact same dilemma. Full-cover Bykski block with their own extensions for the flat connectors, but the extensions are a much finer gauge as well versus the PCI-E PSU cable. Best example of planned obsolescence.


----------



## Imprezzion

Is it normal for the entire GPU to crash to the point that all screens freeze, go black, the hardware-removed sound plays, and then the screens come back (most of the time) with the hardware-connected sound? Event Viewer at that point is full of device-not-found errors and the usual criticals related to the obvious, a.k.a. nvlddmkm.

I'm testing some clocks at very low voltages (0.906v @ 1920 MHz); under load it's fine, but as soon as I go to a game menu, for example, it can sometimes do the above.

I ask because I still don't fully trust those 8-pin adapters, my RGB/fans are pretty shoddily wired, and the PSU bay is crammed full of wiring, splitters and controllers.. I have had issues with bad connections and shorts before, so..


----------



## Falkentyne

Imprezzion said:


> Is it normal for the entire GPU to crash to the point that all screens freeze, go black, the hardware-removed sound plays, and then the screens come back, usually followed by the hardware-connected sound? At that point Event Viewer is full of device-not-found errors, plus the usual criticals from the obvious culprit, nvlddmkm.
> 
> I'm testing some clocks at very low voltages (0.906v @ 1920MHz) and under load it's fine, but as soon as I go to a game menu, for example, it can sometimes do the above.
> 
> I ask because I still don't fully trust those 8 pin adapters, my RGB / fans are pretty shoddily wired, and the PSU bay is crammed full of wiring and splitters and controllers.. I have had issues with bad connections and shorts before so..


That, my friend, means you are unstable.
Either the GPU voltage is too low, you overclocked the memory too far or you pushed the core too high.
You get those errors because the driver is trying to poll the GPU when it's disconnected and trying to reset or crashed. If it's an easy recovery, you just get "display driver has crashed and successfully recovered" without the nvlddmkm errors.


----------



## fray_bentos

Imprezzion said:


> Is it normal for the entire GPU to crash to the point that all screens freeze, go black, the hardware-removed sound plays, and then the screens come back, usually followed by the hardware-connected sound? At that point Event Viewer is full of device-not-found errors, plus the usual criticals from the obvious culprit, nvlddmkm.
> 
> I'm testing some clocks at very low voltages (0.906v @ 1920MHz) and under load it's fine, but as soon as I go to a game menu, for example, it can sometimes do the above.
> 
> I ask because I still don't fully trust those 8 pin adapters, my RGB / fans are pretty shoddily wired, and the PSU bay is crammed full of wiring and splitters and controllers.. I have had issues with bad connections and shorts before so..


Yes, a normal type of GPU crash. I can get this with clocks set too high for a given voltage.


----------



## Imprezzion

I'm used to it just throwing a DirectX error and closing the game lol. Oh well. I dropped down from 1965 @ 0.925 to 1920 @ 0.906 with a custom curve to keep the effective clock high enough (effective sits around 1904-ish), and it ran fine for the rest of the evening, nice and cool, and barely touched 320w peak.


----------



## fray_bentos

Imprezzion said:


> I'm used to it just throwing a DirectX error and closing the game lol. Oh well. I dropped down from 1965 @ 0.925 to 1920 @ 0.906 with custom curve to keep the effective clock high enough, effective sits around 1904 ish, and it ran fine for the rest of the evening and nice and cool and barely touched 320w peak so.


I have an upper set limit of 1920 MHz @ 900 mV, so I have a very similar GPU to yours. Below are my optimised V/F points (3080 FE), which you might find interesting/useful. When I say optimised, I mean 15 MHz higher at any of those voltages gave me crashes. In the last couple of days I used this table to set a fully customised V/F curve, while using the power limit/temp limit to control the noise and hence voltage/frequency (much like stock behaviour, but overclocked at all voltage points).

It was an interesting experiment, and it made benchmark scores skyrocket. However, those numbers are fairly meaningless and artificial due to the shortness of the loads: the high clocks at 370 W load (and initially low temps) dropped to 270 W after a few minutes... Eventually, my set power/temp limits for quiet operation resulted in the GPU "naturally" settling on my normal "sweet spot" setting of 831 mV / 1785-1815 MHz (<65 C allows +15 MHz boost and <60 C allows +30 MHz boost from 1800 MHz set). Aside from this, I have been using the 900 mV capped bin as my "performance" setting, the 800 mV cap as my "efficiency" setting, and the 775 mV cap as my "old game overkill" setting.


set V / mV                 | set freq / MHz               | relative perf. | peak power / W
"old game overkill"   775  | 1695 (+30 <60 C)             | 0.88           | 230
"efficiency"          800  | 1740 (+30 <65 C)             | 0.91           | 240
"sweet spot"          831  | 1785 (+15 <70 C)             | 0.95           | 270
"diminishing returns" 856  | 1830 (+15 <70 C)             | 0.95           | 290
"diminishing returns" 875  | 1860 (+15 <70 C)             | 0.97           | 310
"diminishing returns" 881  | 1875 (-15 >75 C)             | 0.98           | 315
"performance"         900  | 1920 (-15 >75 C)             | 1.00           | 340
"noisy"               931  | 1965 (-15 >75 C)             | 1.02           | 350
"power limited"       962  | 1995 (-15 >75 C)             | 1.04           | 370 PL
"power limited"      1050  | 2040 (-15 >75 C / -30 >80 C) | 1.06           | 370 PL
"power limited"      1075  | 2040 (-15 >75 C / -30 >80 C) | 1.06           | 370 PL
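Treating the V/F points above as data, a tiny script (a hypothetical sketch, with values copied straight from the table) makes the efficiency trade-off obvious: performance per watt falls steadily as voltage rises.

```python
# Hypothetical sketch: performance per watt for each V/F point above.
# Tuples are (label, set voltage in mV, relative performance, peak power in W),
# copied from the table; labels are the poster's own names for the bins.
points = [
    ("old game overkill",  775, 0.88, 230),
    ("efficiency",         800, 0.91, 240),
    ("sweet spot",         831, 0.95, 270),
    ("diminishing",        856, 0.95, 290),
    ("diminishing",        875, 0.97, 310),
    ("diminishing",        881, 0.98, 315),
    ("performance",        900, 1.00, 340),
    ("noisy",              931, 1.02, 350),
    ("power limited",      962, 1.04, 370),
]

for name, mv, perf, watts in points:
    # scale perf/W up by 1000 so the numbers are readable
    print(f"{mv:>4} mV  {name:<18} {1000 * perf / watts:5.2f} perf per kW")
```

Going from the 775 mV bin to the 962 mV one buys roughly 18% more performance for roughly 61% more power, which is exactly the "diminishing returns" the table labels describe.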


----------



## Sanju.ro

Hello!

First post on this forum, so please help me out with a bit of info I couldn't find anywhere.

I've got a 3080 Strix OC coming in, and I want to know how I should connect the 8 pin PCI-E cables, since my power supply only comes with two real connectors for the graphics card and I will be forced to daisy chain. It's a 750W Corsair TX750M, the newer version.

Is it better to daisy chain 8 pin connectors (from left to right) 1 and 2, 1 and 3, or 2 and 3? Which connector should get the full treatment, as in non-daisy-chained, for optimal load balancing of power?

Thank you!


----------



## FedericoUY

Sanju.ro said:


> Hello!
> 
> First post on this forum, so please help me out with a bit of info I couldn't find anywhere.
> 
> I've got a 3080 Strix OC coming in and I want to know how should I connect the 8 pin pci-e cables, since my power supply only comes with two real connectors for the graphics card and I will be forced to daisy chain. It's a 750W Corsair TX750M, the newer version.
> 
> Is it better to daisy chain 8 pin connectors (from left to right) 1 and 2 / 1 and 3 / 2 and 3? Which connector should get the full treatment, as in non daisy chained for the optimal load balancing of power?
> 
> Thank you!


Hello. I wouldn't worry much about the position, since that PSU is a single-rail 12v unit. As long as you keep the card within its normal power draw, it will be fine. You will be pulling at max 320w on load (I think) via the 2 cables and the PCI-E connector itself. If you attempt major overclocks and/or switch the BIOS to a more powerful one, I would change the PSU to one that is also single-rail 12v, but with more 8pin connectors for better power delivery.


----------



## Imprezzion

That card can pull more like 380w, but it's fine. Per-connector load is barely 120w if the balancing is done right.
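As a rough sketch of that arithmetic (the 380 W peak is from the posts above; the ~38 W slot draw is an assumption for illustration, not a measurement of this exact card):

```python
# Back-of-envelope: per-connector load on a 3x 8-pin card like the Strix.
CARD_PEAK_W = 380    # observed peak draw mentioned in the thread
SLOT_W = 38          # assumed PCIe-slot contribution (illustrative)
CONNECTORS = 3       # 8-pin inputs on the card

per_connector_w = (CARD_PEAK_W - SLOT_W) / CONNECTORS
print(f"~{per_connector_w:.0f} W per 8-pin connector")
```

Even at the card's peak, each connector sits well under the 150 W that the 8-pin PCI-E spec allows, which is why a properly balanced card is comfortable here.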


----------



## Perelom4eshui

Hi! Has anyone put solid copper plates on their 3080's memory? How effective are they?


----------



## sdmf74

What's the best waterblock for the EVGA RTX 3080 FTW3 Ultra, and why? I'm not even sure if this GPU can use the active backplate. Any info you guys can give me about FTW3 Ultra waterblocks would be great.


----------



## FUZZFrrek

I've just tried the 450w (+118%) BIOS for the RTX 3080 EVGA FTW3 Ultra, and my card will never exceed 405w under full load, whether playing games (SotTR, ACV, CP 2077) or benchmarking (3DMark Port Royal, Superposition 1080p Extreme, etc). Is there a way to push the card far enough to let it go all the way to 450w?

The OC is +180 between 1.000v and 1.056v (custom curve in AB), +1700Mhz on memory, and stable. Temperatures are below 65C at 100% fan speed on the stock cooler. When I look in GPU-Z, I see PerfCap reason Pwr... Any help would be appreciated!!


----------



## Perelom4eshui

FUZZFrrek said:


> I've just tried the 450w (+118%) BIOS for RTX 3080 EVGA FTW3 Ultra and my card will never exceed 405w on full load, either playing games (SotTR, ACV, CP 2077) or benchmarking (3DMark Port Royale, Superposition 1080 Extreme, etc). Is there a way to push the card far enough to let it go all the way to 450w?
> 
> OC Are +180 between 1.000v and 1.056v (custom curve in AB), +1700Mhz on memory and stable. Temperatures are below 65C, 100% fan speed on stock cooler. When I look in GPU-Z, I PerfCap Pwr... Any help would be appreciated!!


Can you show your GPU-Z screenshots under high load?


----------



## FUZZFrrek

Perelom4eshui said:


> Can you show your GPU-Z screenshots under high load ?


----------



## Perelom4eshui

FUZZFrrek said:


> View attachment 2518468


It looks very strange. Not only is there a power limit, there's also a temperature limit, even though the sensors show very low values. Maybe you have a conflict between overclocking programs? I suggest removing the EVGA utility and using only Afterburner.


----------



## FUZZFrrek

Perelom4eshui said:


> It looks very strange. Not only is there a power limit, there's also a temperature limit. Although the sensors show very small values. Maybe you have a conflict of some overclocking programs? I suggest to remove EVGA utility and use only Afterburner.


I use MSI AB. The power limit is my main concern.


----------



## FedericoUY

Isn't this the same problem we are all (newer 3080 Ultras) having with connector number 3 on EVGAs? There is a dedicated thread on the EVGA forums..
EDIT: I've seen you asked there too, if your user is 'Celtcry'. Anyway, it is happening to the majority of new FTW3 Ultras.


----------



## SoldierRBT

A little bit better than before.

I scored 13 636 in Port Royal
Intel Core i9-10900K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10
www.3dmark.com


----------



## FedericoUY

SoldierRBT said:


> A little bit better than before.
> 
> I scored 13 636 in Port Royal
> Intel Core i9-10900K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10
> www.3dmark.com


You definitely have the best 3080 I've seen. Mine will not balance power correctly (a known issue with EVGA), so even though it behaves very nicely at low volts (2025 to 2040 @ 925mv, 2100+ @ 1000mv, etc.), it will start power limiting when set to 2130+. It's a real shame, because the chip looks great, but yours is awesome on top of not having that problem... Congrats man!


----------



## SoldierRBT

FedericoUY said:


> You definitely have the best 3080 I've seen. Mine will not balance power correctly (a known issue with EVGA), so even though it behaves very nicely at low volts (2025 to 2040 @ 925mv, 2100+ @ 1000mv, etc.), it will start power limiting when set to 2130+. It's a real shame, because the chip looks great, but yours is awesome on top of not having that problem... Congrats man!


Thank you. Could you run RTX Quake 2 for a few minutes on your card and share the GPU-Z values? Make sure the power slider is set to max.


----------



## acoustic

Yeah, you might have some of the best 3080 silicon I've ever seen. That card is an absolute monster.


----------



## SoldierRBT

acoustic said:


> Yeah you might have one of the best 3080 silicon I've ever seen. That card is an absolute monster.


Thank you. What I do is set 1.056v locked at 2265MHz (+240), then I use nvidia-smi to cap it at 2220MHz. That way it will run 2205-2220MHz the entire time in PR. I've tried 2235MHz but it crashes. Memory is only capable of +1200. I got it to run +1220 once, but the second time the score dropped. I'll see if I can get a waterblock to improve memory and core. My daily card is a 3090 KPE.


----------



## FedericoUY

SoldierRBT said:


> Thank you. Could you run RTX Quake 2 for a few minutes in your card and share GPU-Z values? Make sure power slider is set to max.


I'll try to get that game and I'll do it.

EDIT:
@SoldierRBT I ran the tests you asked for:

1st test [email protected] +500vram:
(screenshot attached)

2nd test [email protected] +500vram (check the power limit, showing as limiting the whole time):
(screenshot attached)

In this second run, as the power limit was triggering almost all the time, the clocks would drop even to sub-2000. It must be a design flaw.
On a side note, I'm testing Windows 11, so this was run under that OS. Win10 does the same.


----------



## sdmf74

Any CableMod Pro sleeved PSU kit owners here with a 3080 FTW3 or another GPU that requires 3x 8 pin PCIe cables? I believe the CableMod Pro kits only come with 2x 8 pin and 1x 6 pin PCIe cables.
Did you guys just have to order another 8 pin cable from CableMod?
(EDIT: I ordered one from Amazon)

Another question, if you don't mind. Does anyone know if the plastic end piece on the EK Quantum Vector FTW3 RTX 3080 waterblock is removable? It looks like it should be. I'm not sure if I have enough room because of where my reservoir is mounted, and I'm wondering how the waterblock would look with the end piece removed.


----------



## SoldierRBT

FedericoUY said:


> Ill try to get that game and ill do it.
> 
> EDIT:
> @SoldierRBT ran the tests you asked:
> 
> 1st test [email protected] +500vram:
> View attachment 2518614
> 
> 
> 2nd test [email protected] +500vram (check the powerlimit, all the time showing as limiting):
> View attachment 2518615
> 
> 
> In this second run, as the power limit was triggering almost all the time, the clocks would drop even to sub-2000. It must be a design flaw.
> On a side note, I'm testing windows 11, so this was run under this OS. Win10 does the same.


RTX Quake 2 is free on Steam. 

There's definitely something wrong. I've been reading the EVGA forums and a lot of cards are having the same issue (3080 Ti too). The 3rd 8-pin connector is basically drawing half of what the other two 8-pins are drawing. That's why it can't reach 450W. Some people say the card was first designed with 2x 8-pin and a 3rd 6-pin connector but was changed to 3x 8-pin at the last minute to compete with the other brands. I'd say if it bothers you, try to RMA the card.


----------



## acoustic

SoldierRBT said:


> RTX Quake 2 is free on Steam.
> 
> There's definitely something wrong. I've been reading the EVGA forums and a lot of cards are having the same issue (3080 Ti too). The 3rd 8-pin connector is basically drawing half of what the other two 8-pins are drawing. That's why it can't reach 450W. Some people say the card was first designed with 2x 8-pin and a 3rd 6-pin connector but was changed to 3x 8-pin at the last minute to compete with the other brands. I'd say if it bothers you, try to RMA the card.


There was a guy with a 3080 Ti FTW3 who actually used a clamp meter, and it turns out the 2nd 8-pin is actually drawing 20-30w more than what the software reading shows, which is why some are only seeing 410-420 watts max. The card is actually pulling 450 watts, but the software isn't reading it correctly. It's in the 3080 Ti thread, if I remember correctly. I don't really understand why some cards are reading correctly and others aren't - my 3080 FTW3 had no problem hitting 450 watts, with transient spikes up into the 480 watt range. IMO, there's something very wrong with the design and/or the components. There's no explanation as to why some cards read correctly and others don't. I wouldn't be surprised if some cards actually aren't pulling that 450 watts, with all the craziness surrounding the FTW3 this generation.

The scary part, though, is that those with a 3080/3080 Ti/3090 FTW3 are actually very close to popping the 20A fuse on that 2nd 8-pin, especially with the 500w+ BIOS on the 3090 FTW3.
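To get a feel for how tight that margin is, power divided by the nominal 12 V rail gives the current through one connector. This is a simplified sketch: the 240 W figure below is a hypothetical unbalanced draw, not a measurement, and a real rail isn't exactly 12 V.

```python
# Hedged sketch: current through one 8-pin vs. a 20 A fuse.
RAIL_V = 12.0    # nominal 12 V rail (real rails vary slightly)
FUSE_A = 20.0    # fuse rating mentioned above

def connector_amps(watts: float) -> float:
    """Current drawn through one connector at the given power."""
    return watts / RAIL_V

# 150 W is the 8-pin spec; 200 W and 240 W are illustrative unbalanced draws.
for watts in (150, 200, 240):
    amps = connector_amps(watts)
    print(f"{watts} W -> {amps:.1f} A ({100 * amps / FUSE_A:.0f}% of the fuse)")
```

At a hypothetical 240 W on one connector, the fuse would be at its full 20 A rating, which is exactly the "very close to popping" scenario described above.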


----------



## FedericoUY

SoldierRBT said:


> RTX Quake 2 is free on Steam.
> 
> There's definitely something wrong. I've been reading the EVGA forums and a lot of cards are having the same issue (3080 Ti too). The 3rd 8pin connector is basically drawing half of what the other 2x 8pins are drawing. That's why it can't reach 450W. Some people say the card was first design with 2x 8pin and a 3rd 6pin connector but was changed to 3x 8pin at the last minute to compete with the other brands. I'd say if it bothers you, try to RMA the card.


That's right, it's a known issue that EVGA still has not come out and explained. It's possible that since the first batch had problems with the 3rd connector (red light of death), they changed something, and now in this batch that connector draws half the power (like a 6-pin connector). The EVGA forums are full of complaints about this, and it really pisses me off that my card has this problem. RMA for me is complicated since I do not live in the USA, and besides, it's possible they would send me an identical card. At least this one can game at 2040 at really low voltages...

BTW, that reading I did was with RTX Quake 2; I downloaded and tested it as you requested. It really puts the card at full load and full wattage. Nice find!


----------



## FedericoUY

acoustic said:


> There was a guy with a 3080 Ti FTW3 who actually used a clamp meter, and it turns out the 2nd 8-pin is actually drawing 20-30w more than what the software reading shows, which is why some are only seeing 410-420 watts max. The card is actually pulling 450 watts, but the software isn't reading it correctly. It's in the 3080 Ti thread, if I remember correctly. I don't really understand why some cards are reading correctly and others aren't - my 3080 FTW3 had no problem hitting 450 watts, with transient spikes up into the 480 watt range. IMO, there's something very wrong with the design and/or the components. There's no explanation as to why some cards read correctly and others don't. I wouldn't be surprised if some cards actually aren't pulling that 450 watts, with all the craziness surrounding the FTW3 this generation.
> 
> The scary part, though, is that those with a 3080/3080 Ti/3090 FTW3 are actually very close to popping the 20A fuse on that 2nd 8-pin, especially with the 500w+ BIOS on the 3090 FTW3.


Yep, this FTW3 revision is very wrong. As you say, the 2nd connector, on my card at least, sometimes pulls more than the spec'd 150w; it may eventually blow something up, just from not being able to balance power correctly. If any electronics guru came out and said which component could be changed to make this work as it should, I would try it.
I'd also like to take some pictures of my card and of any identical card without this issue, to see if any different component (like a shunt or something else) can be spotted with the naked eye.

EDIT: Well, on the TechPowerUp site at least the first revision of the card can be seen. Does anyone have a pic of a card with the faulty 3rd connector to share?

EVGA GeForce RTX 3080 FTW3 Ultra Review (www.techpowerup.com)


----------



## Sanju.ro

FedericoUY said:


> Hello. I wouldn't worry much about the position, since that PSU is a single-rail 12v unit. As long as you keep the card within its normal power draw, it will be fine. You will be pulling at max 320w on load (I think) via the 2 cables and the PCI-E connector itself. If you attempt major overclocks and/or switch the BIOS to a more powerful one, I would change the PSU to one that is also single-rail 12v, but with more 8pin connectors for better power delivery.





Imprezzion said:


> That card can do more like 380w but it's fine. Per connector load is barely 120w if balancing is done right.


Hey, guys! I'm back after some testing and here are my findings regarding RTX 3080 Strix OC White non-LHR:

1. stock power limit is 370W (in reality it's closer to 380 W) and max power limit is 450 W - the slider goes to 121%;
2. max power draw from the PCIE slot I've seen is 38W;
3. it's pulling more power from the second 8 pin connector (the one in the middle either way you look), like 153W.

In conclusion: I'm daisy chaining connectors 1 and 3 and connector 2 is using a full 8 pin pci-e cable by itself.

Even though Asus recommends an 850W power supply, so far I've not encountered any issues with my 750W Corsair TX750M. It could be because my CPU is the low-powered Ryzen 3700X, not overclocked, but I'm happy.

The card gets pretty hot in my Phanteks P600S (front and top panel removed) with 5 fans total, 3 intake and 2 exhaust: up to 78C for the core and 91C for the GPU hotspot. VRAM gets to 88C. My suspicion is that a lot of fresh air is wasted below the power supply shroud and not enough gets to the card itself. Maybe I'll look into drilling holes in the case's shroud or removing it entirely, I'm not sure yet.


----------



## sdmf74

So is this an issue with all FTW3s? I've got a 3080 FTW3 and an EK waterblock on the way, and I was not aware of this problem.
Are any YouTubers looking into it? E.g. Buildzoid or anyone.
It's too bad EVGA is not telling us what's going on.


----------



## FedericoUY

sdmf74 said:


> So it's this issue with all FTW3's? I've got a 3080 ftw3 and ek waterblock on the way and I was not aware of this problem.
> Are any YouTubers looking into it? IE buildzoid or anyone.
> It's too bad EVGA is not telling us what's going on?


Yes, everyone in this situation is kinda pissed off, including me. If your card is NIB, it will probably have this issue, but who knows... The EVGA forums have a lot on this, but no one at EVGA is saying anything. I would really love for someone with strong electrical skills to let us all know what is going on with the unbalanced cards...
Good luck!


----------



## vaeron

3D Mark results for my latest upgrade:

What I'm coming from: Core i7-7820X / MSI GTX 1070 Ti - Time Spy: I scored 6 327 in Time Spy
Upgrades: Core i9-10980xe / ASUS TUF OC RTX 3080
Time Spy: I scored 15 964 in Time Spy
Port Royal: I scored 11 294 in Port Royal

I'm quite happy with the upgrade, next up is the liquid loop that's been sitting waiting for nearly a year. Hoping to have it done by this weekend.


----------



## mouacyk

vaeron said:


> 3D Mark results for my latest upgrade:
> 
> What I'm coming from: Core i7-7820X / MSI GTX 1070 Ti - Time Spy: I scored 6 327 in Time Spy
> Upgrades: Core i9-10980xe / ASUS TUF OC RTX 3080
> Time Spy: I scored 15 964 in Time Spy
> Port Royal: I scored 11 294 in Port Royal
> 
> I'm quite happy with the upgrade, next up is the liquid loop that's been sitting waiting for nearly a year. Hoping to have it done by this weekend.


Btw, in TimeSpy you scored 17 195.


----------



## vaeron

mouacyk said:


> Btw, in TimeSpy you scored 17 195.


Good call, that was my overall score though, which is what I was comparing since I upgraded both the CPU and GPU. Appreciate you pointing that out.


----------



## Castaile

KShirza1 said:


> Popped in my old 3080 and bypassed my loop in my main pc while I water cool my 3080 Ti. Forgot to join this group last year!
> 
> Build log
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 3080 powering my c9 in my game room rig


Attention seeking much? Flexing your 3080 and 3080ti. 
Perhaps post some overclocking results as well?


----------



## ssgwright

Not too shabby for a TUF, Port Royal: 13,023

I scored 13 023 in Port Royal
Intel Core i9-10850K Processor, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10
www.3dmark.com


----------



## FedericoUY

Very nice score! Is the TUF a 3x 8 pin powered board?


----------



## ssgwright

FedericoUY said:


> Very nice score! Is the TUF a 3 8 pin powered board?


No, it's a 2x 8 pin.


----------



## Nizzen

Castaile said:


> Attention seeking much? Flexing your 3080 and 3080ti.
> Perhaps post some overclocking results as well?


Flex?

😘


----------



## SgtRotty

Hello! I'm running an RTX 3080 Gigabyte Master (rev. 1) with the 320w BIOS. I can only get +400 on memory, and core clocks of 1845-1860 at 0.881v. It's 100% stable with power peaking up to 300 watts, no throttling. Is there a BIOS that's confirmed to work on this card for more power?


----------



## Imprezzion

SgtRotty said:


> Hello! I'm running a rtx3080 gigabyte master (rev.1) 320w bios I can only get +400 memory / core clocks at 1845-1860 .881v. Its 100% stable with power peaking up to 300 watts no throttling. Is there a bios thats confirmed will work on this card for more power?


Rev 1 is dual 8 pin, so really, no. The EVGA XC3 BIOS allows you to go up to 366w (I run it on my Gigabyte Gaming OC rev 1 2x8pin), which is effectively more like 345-350w max before it throttles, but it should be able to at least hold 0.956v-ish. I run 1950-1965 @ 0.925 on mine, rarely go over 320w, and it never throttles. I can go up to 1995 @ 0.956 if I want without throttling.

Memory on my Gaming OC does +1400 effortlessly tho. It only hits ECC above +1550, but to be on the safe side I keep it at +1400.


----------



## SgtRotty

Imprezzion said:


> Rev 1 is dual 8 pin so really, no. EVGA XC3 BIOS allows you to go up to 366w (I run it on my Gigabyte Gaming OC rev 1 2x8pin) which is effective more like 345-350w max before it throttles but it should be able to at least hold 0.956v ish. I run 1950-1965 @ 0.925 on mine and rarely go over 320w and it never throttles. I can go up to 1995 @ 0.956 if I want to without throttling.
> 
> Memory on my Gaming OC does +1400 effortless tho. It only hits ECC above +1550 but to be on the safe side I keep it at +1400.


Does running that BIOS cause any problems with the DisplayPort outputs on the card? I'm only running 1 monitor.. Thanks for the quick reply.


----------



## Imprezzion

SgtRotty said:


> Does running that bios cause any problem with display port on the card? I'm only running 1 monitor.. Thanks for the quick reply


Not on my Gaming OC, but it isn't the same PCB, so I'm not sure. I use 3 monitors, 1 DP and 2 HDMI, and all ports work fine.


----------



## SgtRotty

Imprezzion said:


> Not on my Gaming OC but it isn't the same PCB so not sure. I use 3 monitors, 1 DP 2 HDMI and all ports work fine.


So I went back and tried your settings to see what happens. I'm now at 0.925v at 1965, and it seems to show 327 watts now, but no downclocking/throttling. I assumed that when I reached 320w there was a throttle somewhere, but HWiNFO64 says no in the performance limit - power section. If yours throttles around 350, I should still have a little headroom! This is on my original BIOS, thanks again!


----------



## Imprezzion

SgtRotty said:


> So I went back and tried your settings to see what happens. I'm now at 0.925v at 1965, and it seems to show 327watts now but no downclocking throttling. I assumed when I reached 320w there was a throttle somewhere. Hwinfo64 says no on the performance limit power section.If yours throttles around 350, I should have a lil headroom still! This is on my original bios, thanks again!


Stock Gigabyte BIOS throttles at 340w. So you should be fine there.


----------



## SgtRotty

Imprezzion said:


> Stock Gigabyte BIOS throttles at 340w. So you should be fine there.


----------



## Imprezzion

Btw guys, I was wondering, is there any real benefit to running these cards at very low temperatures besides the obvious longevity? I mean, the RTX 2xxx cards would drop a clock bin every so many degrees even with a custom curve, but my 3080 doesn't seem to do that at all. No matter the temp, it just runs what I set and that's that.

I used to have my radiator fans quite loud under load, with the fan curve at about ~75% / 1580 RPM, which keeps the card at 2010 @ 1.018v at around 330w load (~1985 effective clock), around 51-53c core with a 68-70c hotspot. I dropped the fan speed a considerable amount, to the point that it's very quiet now at ~52% / 1140 RPM, which makes it run 60-62c core and 81-83c hotspot. To me this looks totally acceptable, except for the large delta between core and hotspot, but I need to remount anyway when I swap the thermal pads for the VRAM so.. If it's just as stable at the same clocks, can it "hurt" to run those temps?


----------



## SgtRotty

Imprezzion said:


> Btw guys, I was wondering, is there any real benefit of running these cards at a very low temperature besides obviously longevity? I mean, the RTX2xxx would drop clock bins every so many degrees even with custom curve but my 3080 doesn't seem to do that at all. No matter the temp it just runs what I set and that's that.
> 
> I used to have my radiator fans quite loud under load with the fancurve at about ~75% 1580 RPM and that keeps the card @ 2010 1.018v at around 330w load (~1985 effective clock) around 51-53c core with 68-70c hotspot. I dropped the fan speed a considerable amount to the point it's very quiet now at ~52% 1140RPM fan speed which makes it run 60-62c core and 81-83c hotspot. To me this looks totally acceptable except the large delta between core and hotspot but I need to remount anyway when I swap the thermal pads for the VRAM so.. If it's just as stable at the same clocks can it "hurt" to run those temps?


Those might be acceptable temps for some. Me personally, I'm getting close to the same temps. Memory junction is going up to 80c+ on mine, so I'm gonna swap the thermal pads to see if it helps. I'll try and post before and after results. Anything over 80c makes me nervous.


----------



## Imprezzion

SgtRotty said:


> It might be acceptable temps for some. Me personally im getting close to the same temps. Memory junction is going up to 80c+ on mine so im gonna swap the thermal pads to see if it helps. I'll try and post before and after results. Anything over 80c I get nervous


My memory junction sits at 78-80c as well, but that isn't because of airflow. The memory chips are cooled by the same waterblock on the EVGA FTW3 Hybrid coolers. There's a copper piece you screw onto the block that covers the VRAM, but the thermal pads are just as bad as all the other brands', so I need better ones. That would bring VRAM junction down to the 50s.

It's more that 83c hotspot sounds high. Obviously every single air-cooled card on the stock fan curve runs way hotter, but yeah..


----------



## Pro4TLZZ

Hello World!

This is my best score with my EVGA FTW3 ULTRA GAMING 3080:
*13 157*

https://www.3dmark.com/pr/1171664
I am 7th in the UK for RTX 3080 behind @Clukos


----------



## RobertoSampaio

Hi...

I have a Gigabyte RTX 3080, and sometimes, after a long time playing, the image starts to flicker.
I exit the game and run Kombustor, and the Windows desktop flickers too...
Sometimes I restart Windows and run Kombustor again and the problem persists; sometimes the problem is solved.
Temperatures are OK...
What could it be?
Any guesses?


----------



## ssgwright

hmm, maybe try a fresh driver install (custom, uninstall/reinstall)


----------



## Imprezzion

RobertoSampaio said:


> Hi...
> 
> I have a gigabyte rtx3080 and sometimes, after a long time playing, the image start to flicker.
> I exit the game and run kombustor and windows desktop flickers too...
> Sometimes I restart the window and run kombustor again and the problem persists, sometimes the problem is solved.
> Temperatures are ok...
> What could be?
> Any guess?


Are you using G-Sync by any chance?


----------



## RobertoSampaio

Imprezzion said:


> Are you using G-Sync by any chance?


Yes, I am... I'll try changing that when it happens again.


----------



## MikeS3000

I have flickering on my Gigabyte RTX 3080 and it went away when I changed to a different displayport output on the card. One of my outputs causes flickering and the others seem fine. Have you tried that?


----------



## RobertoSampaio

MikeS3000 said:


> I have flickering on my Gigabyte RTX 3080 and it went away when I changed to a different displayport output on the card. One of my outputs causes flickering and the others seem fine. Have you tried that?


But did you have the flickering all the time, or only sometimes?

I have this problem 1 or 2 times per month.

I'll change the output now.


----------



## Imprezzion

RobertoSampaio said:


> But you had the flickering all the time or sometimes?
> 
> I have this problem 1 or 2 times per month
> 
> I'll change the output now.


My Gigabyte has done this once or twice as well, and it was related to G-Sync. My monitor is not officially supported, even tho it does work fine, but it sometimes flickers a little at lower frame rates, and that stays on the desktop afterwards until I reboot. I also changed the DP port and switched to G-Sync only in full screen (not windowed mode), and now it's gone and works fine.


----------



## RobertoSampaio

So far it's working fine...
But I need to test for a month...


----------



## MikeS3000

Nice, curious to see if the same solution worked for you.


----------



## christizzz

Hello,

I undervolted my Asus TUF 3080 OC (V2 LHR) with the following parameters: 1860 MHz at 0.862V. I am getting better results in Time Spy and Port Royal than with the stock config.
I think it's because I'm averaging a higher clock (1860 MHz) instead of the 1825 MHz I get in Time Spy with the stock config...

Without undervolting (stock config), the max boost clock I saw was 1920-1935 MHz, which I found weird compared to results from the V1 cards, where I saw boost clocks above 1950 with higher temps... My card is not underperforming, I can see that, but I don't understand why the max boost clocks are somewhat lower... My temps without undervolting were around 65 degrees...

I suspect the VBIOS of the newer V2 cards... checking the voltage curve, I see a clock of 1920-1935 at 1.025-1.075V, and the card does not go above those voltages at stock, even when increasing the power slider to 110%...

Did they change something for the V2 cards?
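A rough way to see why an undervolt like this can beat stock: GPU dynamic power scales approximately with frequency times voltage squared, so dropping voltage frees up power-limit headroom that the card can spend on a higher sustained clock. The sketch below uses the clocks and undervolt voltage from the post; the ~1.000V stock voltage and the scaling law itself are assumptions, not measurements.

```python
# Approximate dynamic power scaling: P ~ f * V^2.
# Clocks/undervolt voltage are from the post; stock voltage is assumed.

def relative_power(freq_mhz: float, volts: float,
                   ref_freq_mhz: float, ref_volts: float) -> float:
    """Dynamic power relative to a reference operating point (P ~ f * V^2)."""
    return (freq_mhz / ref_freq_mhz) * (volts / ref_volts) ** 2

# Stock Time Spy average ~1825 MHz at an assumed ~1.000 V,
# vs. the undervolt: 1860 MHz at 0.862 V.
ratio = relative_power(1860, 0.862, 1825, 1.000)
print(f"Undervolted draw is ~{ratio:.0%} of stock at a *higher* clock")
```

Under these assumptions the undervolted point draws roughly three quarters of the stock power, which is why the card can hold 1860 MHz continuously instead of bouncing off the power limit.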


----------



## kovyrshin

Hi all. 
Got a 3080 Aorus Master today, the LHR version that comes with 3x 8-pin. Does Gigabyte have a BIOS with a lifted power limit, like the EVGA XOC BIOS, by any chance? Or does the 3x 8-pin card have the same limit as the 2x 8-pin?


----------



## Nizzen

kovyrshin said:


> Hi all.
> Got 3080 Aorus Master today, LHR version that comes with 3x 8PIN. Does Gigabyte have bios with lifted PL like EVGA XOC Bios by any chance? or 3-pin card have same limit as 2-pin?


There is no 3080 XOC BIOS. Shunt-mod it to unlock more power, or use a 3080 BIOS like the Strix one to get a bit more power.
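For context on why a shunt mod lifts the power limit: the VRM controller infers current from the voltage drop across a known shunt resistor, so soldering a second resistor in parallel lowers the real resistance and makes the card under-report its draw. The resistor values below are illustrative assumptions (a common stock value is a few milliohms), not measurements from any specific 3080.

```python
# The controller computes I = V_shunt / R_assumed. A parallel resistor
# halves the real shunt resistance, so for the same actual current the
# sensed voltage halves and the card *reports* half the real power.
# Resistor values are illustrative assumptions.

def parallel(r1: float, r2: float) -> float:
    return (r1 * r2) / (r1 + r2)

R_STOCK = 0.005                      # assumed 5 mOhm stock shunt
R_MOD = parallel(R_STOCK, 0.005)     # second 5 mOhm soldered on top

actual_watts = 450.0
reported_watts = actual_watts * (R_MOD / R_STOCK)
print(f"Card reports {reported_watts:.0f} W while actually drawing {actual_watts:.0f} W")
# A power limit enforced on the *reported* value now allows roughly
# double the real draw -- which is also why it voids warranties and
# can overstress the VRM and cabling.
```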


----------



## RobertoSampaio

What is your Afterburner CPU priority?
Mine is running at "below normal"... Is that right?


----------



## Astral85

I heard that Asus had a "450W BIOS". When I bought my Strix 3080 OC recently, I was pulling up to 422W in 3DMark on the stock BIOS. I decided to flash this Asus "450W BIOS" to my second BIOS slot anyway. I got worse spec readings in GPU-Z on the "Asus" BIOS than on the stock BIOS, and the "Asus BIOS" didn't have ReBAR support in it. Does anyone know anything about this "Asus 450W BIOS"?

This is the original article I found: Asus new BIOS increase power limit (450W) of Strix / TUF GeForce RTX 3090 / 3080


----------



## kovyrshin

Nizzen said:


> There is no 3080 xoc bios. Shuntmod it, to unlock more power, or use 3080bios like Strix, to get a bit more power


Are those (Strix vs Aorus Master) compatible between each other? 
Can anyone point me to Strix LHR BIOS with 450W limit?


----------



## delreylover

So i was out of the loop for a while. Any updates on increasing power limits on Palit GamingPro 3080?


----------



## dk10438

I was selected in the Newegg Shuffle for an EVGA 3080 FTW. Here's my question:
I'm aware of the VRAM overheating issues when mining. However, I plan on using this card in my gaming computer and don't plan on mining with it. Are these cards still overheating/thermal throttling with just gaming?


----------



## Imprezzion

dk10438 said:


> was selected in the Newegg shuffle for a EVGA 3080 FTW. Here's my question.
> I'm aware of the VRAM overheating issues when mining. However, I plan on using this card in my gaming computer and don't plan on mining with it. Are these cards still overheating/thermal throttling just with gaming??


Not really, and the FTW3 is one of the better ones; it doesn't really have the same problems to that extent. A stock one should stay in the 80s.


----------



## dk10438

Imprezzion said:


> Not really and the FTW3 is one of the better ones that doesn't really have the same problems to that extent. A stock one should stay in the 80's.


thanks.


----------



## Sleepycat

dk10438 said:


> was selected in the Newegg shuffle for a EVGA 3080 FTW. Here's my question.
> I'm aware of the VRAM overheating issues when mining. However, I plan on using this card in my gaming computer and don't plan on mining with it. Are these cards still overheating/thermal throttling just with gaming??


Mine is the 3080 XC3 with the smaller cooler. When testing with NiceHash lite, the memory temperature stays below 98°C in an enclosed case with just 1 intake and 1 exhaust fan. No problems with gaming, as the VRAM temperature is much lower for me. You might see a difference if you run 4K Ultra on demanding titles. The bigger issue is probably that the heat generated by the GPU is dumped into the case, which then has to cool your CPU as well. So I would improve the case intake and exhaust.


----------



## dk10438

Just received the card today so I’ll install and test. It’ll be going into a lian li 011 with 6 intake fans and 3 exhaust…

just installed it. With very light testing VRAM temps are staying below 82.


----------



## famich

Hello, everyone. I got a new Gainward RTX 3080 Phantom, a very nice card: Port Royal stable at +180 MHz, Cyberpunk a bit less, of course. Has anyone got the 440W BIOS from the Phantom GS and a matching NVFlash? I cannot seem to flash anything to this card's EEPROM apart from making the backup, of course. The newest NVFlash is most probably not patched, so no luck changing anything. Am I right? Thanks for any advice.


----------



## Imprezzion

famich said:


> Hello, anyone - I got a new Gainward RTX 3080 Phantom - a very nice card, Port Royal 180+MHZ stable, Cyberpunk a bit less, of course . Has anyone got a 440W BIOS from Phantom GS and a matching NVFLASh ? I cannot seem to be able to flash anything on this cards EEPROm apart from doing the backup, of course. The newest Nvflash ist most probably not patched, so, no luck in changing anything. Am I right ,? Thanks for any advice.











Gainward RTX 3080 VBIOS: 10 GB GDDR6X, 1440 MHz GPU, 1188 MHz Memory (www.techpowerup.com)

Only flash if your non GS has 3x8 pin. If 2x8 pin do not bother, it'll reduce power limits.


----------



## famich

Imprezzion said:


> Gainward RTX 3080 VBIOS: 10 GB GDDR6X, 1440 MHz GPU, 1188 MHz Memory (www.techpowerup.com)
> 
> Only flash if your non GS has 3x8 pin. If 2x8 pin do not bother, it'll reduce power limits.


Many thanks. I might need the newest NVFlash for it, as the one I got cannot overcome the ID mismatch. Sadly, the -6 parameter no longer works..
Phantom and Phantom GS are the same: 3x 8-pin, custom PCB. You only pay for the better BIOS with the GS.. I doubt that Gainward would go the extra mile to bin the GS chips.


----------



## Imprezzion

famich said:


> Many thanks, I might need the newest NVFLASH for it, as the one I got cannot overcome ID mismatch. Sadly , the -6 parameter no longer works..
> Phantom and Phantom GS are the same, 3x8PIN, custom PCB. YOu only pay for the better BIOS with GS.. I doubt that GW would go an extra mile to bin the GS chips.


They claimed they did on the older GTX 1xxx cards. I had a 2080 Ti GS, but no idea if it was pre-binned. It was a €500 eBay special from a waterblock installation gone wrong, so it came as a bare PCB with no stock cooler lol. Still runs fine after almost 2 years with an Accelero..

I'm still looking at trading my 2x8pin 3080 for a 3 pin model (or a Ti) but the market is still scuffed here.


----------



## famich

A 3x 8-pin 3080/3090 is a must. I used to have an EVGA RTX 3090 ACX Hybrid; it was practically useless, always running into its power target.


----------



## famich

Need that NVFLASH though


----------



## Falkentyne

famich said:


> Many thanks, I might need the newest NVFLASH for it, as the one I got cannot overcome ID mismatch. Sadly , the -6 parameter no longer works..
> Phantom and Phantom GS are the same, 3x8PIN, custom PCB. YOu only pay for the better BIOS with GS.. I doubt that GW would go an extra mile to bin the GS chips.


Use this one at your own risk. (do not use -6).






nvflash.exe (drive.google.com)


----------



## acoustic

Falkentyne said:


> Use this one at your own risk. (do not use -6).
> 
> nvflash.exe (drive.google.com)


Ol' Falkentyne coming in hot with the goods!


----------



## famich

Falkentyne said:


> Use this one at your own risk. (do not use -6).
> 
> nvflash.exe (drive.google.com)


Thanks —))


----------



## Falkentyne

famich said:


> Thanks —))


Assuming it worked?
(There are two bytes patched in there by someone who said they modded it for cross-flashing laptop Ampere RTX cards to a higher power limit, which worked for them. I did a file compare (fc /b) with the original 2.670.0.)
No one has been brave enough to try cross-flashing a Founders Edition with an AIB or 1KW BIOS, however! One person tried cross-flashing his Strix (3090) with the FE BIOS, got a "VGA Load BIOS" POST code error on his motherboard, and had to reflash by booting to the backup BIOS.


----------



## famich

Falkentyne said:


> Assuming it worked?
> (there are two bytes patched in there by someone who said they modded it for cross flashing laptop Ampere RTX for a higher power limit, which worked for them). I did a file compare (fc/b) with the original 2.670.0.
> No one has been brave enough to try crossflashing a Founder's edition with an AIB or 1KW Bios however! One person tried crossflashing his Strix (3090) with the FE Bios and got a VGA Load Bios Post code error on his motherboard and had to reflash by booting to the backup bios.


I will give it a try.. It is a pity that the good old times of easy flashing are gone. Phantom vs. Phantom GS: the BIOS difference is a 400W vs. 440W power target.


----------



## famich

famich said:


> I will give it a try.. It is a pity that good old times with an easy going flashing are gone Phantonm and Phantom GS : the BIOS difference is 400-440W power target.


Sadly, it didn't work. I have done this many times before. I don't know what the problem is.


----------



## Imprezzion

I mean, I flashed my Gigabyte Gaming OC with a EVGA XC3 BIOS just fine. Don't know exactly which nvflash I used but I can upload it. Gimme a sec.


----------



## mouacyk

-6 is enough


----------



## Falkentyne

famich said:


> Sadly,it didn’t work.I have done it many times before. Don’t know what is the problem.


What happened? 
Did you use -6?
Do not use -6 with that flasher. It won't say PCI mismatch.


----------



## famich

Falkentyne said:


> What happened?
> Did you use -6?
> Do not use -6 with that flasher. It won't say PCI mismatch.


Nope, just the usual: opened a command prompt there,
then saved backup.rom; that was OK.
Renamed the GS BIOS to VBIOS.rom,
then ran nvflash VBIOS.rom.
After that the driver paused for just a second and came right back; nothing happened.
I even tried the --protectoff command; it did not help.
Gainward might have some extra lock, or I must be doing something wrong. Thanks.


----------



## famich

Got it: ID mismatch, see the attached image. It looks like this version of NVFlash is not patched at all.


----------



## Falkentyne

It's only patched for certain device IDs; it was made for laptop cross-flashing. Several users were able to flash a higher-TDP BIOS with it where all of the unpatched versions gave a device mismatch, even with -6. I don't know if it works on desktops, as I can't even use it. I know it works for at least some 3090 cards: a Strix user was able to flash a 3090 FE BIOS with it, but then he got a "Load VGA BIOS" POST code error on his motherboard, since apparently the power delivery is incompatible or something.


----------



## famich

Yes, you are right. No big deal here, and I do thank you for your support.
Maybe someone somewhere will have another version.


----------



## Pro4TLZZ

I took top spot in the UK for 3080 on port royal.
Score 13364 








I scored 13 364 in Port Royal: Intel Core i7-8700K, NVIDIA GeForce RTX 3080 x 1, 16384 MB, 64-bit Windows 10 (www.3dmark.com)






Sorry @Clokus


----------



## dk10438

Just started OC'ing my 3080 FTW. I'm trying to get 19000 on the graphics score, but the highest I can get is 18975. So close....


----------



## Imprezzion

Got bored and decided to try some BIOS from TechPowerUp Database again just to see if any of them for a 2x8pin would behave any different on my Gigabyte Gaming OC Rev 1.0.

Conclusion, yes they do. 


- Most of them are limited to 340W and aren't really of any use.
- The TUF BIOS does not work properly on a non-TUF card.
- The Gigabyte stock updated ReBAR BIOS is quite good and runs pretty well up to around 355W before it really starts to throttle.
- The EVGA XC3 (Ultra) BIOS works great on any 2x8-pin card. It has a 366W limit (345-350W effective), runs very well, and somehow reports a lower power draw than other BIOSes: 273W vs. 301W in the exact same static scene with the same effective clock, memory clock and voltage (1950 MHz @ 0.925V, +1400 memory). So it might help.
- The Aorus Master (R1.0 2x8-pin model) BIOS. Yeah, I did not see that one coming.. one DP output doesn't work, but that's about it. It throttles at the same point all of them do, around 345-355W, but where the stock Gaming OC BIOS maxes out at about 1980 MHz @ 1.006V at stock / 0 core offset before throttling, the Aorus Master BIOS runs a whopping 2025 MHz @ 1.062V without even hitting the limiter at stock / 0 core offset, with only 330-332W drawn. FPS is measurably higher in the same static scene as well, and temps are about 3C lower for both core and hotspot, even at much higher clocks and voltage. So I let it stretch its legs a bit, not using the curve, just an offset, and it settled at around 2070 MHz @ 1.043V and still isn't hitting the limiter. It barely throttles at all, usually jumping between 2085 @ 1.056 and 2070 @ 1.043, and just stays there at 338-344W. Massive improvement. Effective clocks are also way above 2050, so it's not like it's running a super low effective clock like the TUF BIOS does.


----------



## D13mass

Hey guys, I have a 3080 Aorus Xtreme with a maximum 450W power limit. Is there something like an XOC BIOS without a power limit for the 3080?
This is my card and my current BIOS: Gigabyte RTX 3080 VBIOS
PS. Yes, I have a custom watercooling loop and many radiators.


----------



## Imprezzion

D13mass said:


> Hey, guys, I have 3080 Aorus Xtreme with maximum 450W power limit, do we have something like XOC bios power limitless for 3080?
> This is my card and my current bios Gigabyte RTX 3080 VBIOS
> PS. Yes, I have watercooling custom loop and many radiators.


Nope, 450W is all you get. Even the EVGA XOC BIOS is 450W max. Then again, max voltage is still 1.100V, and I can't think of any load or core clock that will exceed 450W all that easily.

I can now, on the Aorus Master 2x8 pin BIOS hold this:










Not really hitting the power limit; still at +0V, a +45 offset, no custom curve to limit it, and +1400 memory.

Was playing Division 2 here, but the drawn power in for example Horizon Zero Dawn is even lower. Barely hits 300w at 1.062v.


----------



## D13mass

Imprezzion said:


> Nope 450w is all you get. Even EVGA XOC BIOS is 450w max. Then again, max voltage is still 1.100v and I can't think of any load or core clock that will exceed 450w all that easily.
> 
> I can now, on the Aorus Master 2x8 pin BIOS hold this:
> 
> View attachment 2523722
> 
> 
> Not hitting power limit really, still at +0v, +45 offset, no custom curve to limit it and +1400 memory.
> 
> Was playing Division 2 here, but the drawn power in for example Horizon Zero Dawn is even lower. Barely hits 300w at 1.062v.


Thanks for your reply, but:
1. I don't understand when people write +xxx core, +xxx memory; nobody knows what your stock clocks are!
2. Got it that 450W is the maximum without a shunt mod.
3. My card is better with undervolting; for example, 2070 core / 22000 memory at 975mV is much cooler and more stable than stock at 2100/19000 and 1.068V.
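The "+xxx" confusion in point 1 comes down to offsets being relative: an Afterburner offset is added to each card's own boost behaviour, so the same "+45" means a different absolute clock on every model and BIOS. A tiny sketch, using the in-game boost clocks reported earlier in this thread as example baselines (actual boost also shifts with GPU Boost bins, temperature and power):

```python
# An Afterburner offset is relative to the card's own boost clock, so
# "+45" lands at different absolute clocks on different cards/BIOSes.
# Baseline clocks below are examples taken from posts in this thread.

RATED_BOOST_MHZ = {
    "Gigabyte Gaming OC (stock BIOS)": 1980,
    "Aorus Master BIOS (crossflashed)": 2025,
}

OFFSET_MHZ = 45  # the "+45" offset mentioned above

for card, boost in RATED_BOOST_MHZ.items():
    print(f"{card}: +{OFFSET_MHZ} -> ~{boost + OFFSET_MHZ} MHz")
```

This is why quoting absolute clock and voltage (e.g. "2070 @ 975mV") is more useful to other owners than a bare offset.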


----------



## Imprezzion

D13mass said:


> Thanks for your reply, but:
> 1. Do not understand when people write +xxx core, +xxx memory, nobody knows what do you have as stock!
> 2. Got it that 450 is maximum without shunt mod.
> 3. My card is better with downvolting, for example 2070 core 22000 memory and 975mV voltage much better, cooler and stable that stock with 2100/19000 and 1.068


Stock boost on this BIOS is 2025 MHz.
On the stock BIOS it's 1980 MHz.
The most efficient I can do on this card is 1950 MHz @ 0.925V, which I used to run on the stock and EVGA XC3 BIOS (second BIOS slot).

This card's core is terrible. It needs a lot of voltage to run above 1995 MHz, so I am seriously happy I found a BIOS that allows me a lot more voltage, as I can now raise the core clock by 105 MHz, which is quite a lot.

I played Horizon Zero Dawn, Battlefield 1 and Division 1 over the evening, and all games held full clocks and voltage and didn't throttle even once. It also ran stable, which never happened before above 1995 MHz lol.


----------



## Astral85

One thing I've noticed since getting my RTX 3080 is that it's a real power hog. 😮


----------



## Imprezzion

Astral85 said:


> One thing I've noticed since getting my RTX 3080 is it's a real power hog. 😮


I mean, it's less power hungry than a 2080 Ti at least. So far. This card does 335-345W @ 2070 MHz core, and my 2080 Ti @ 2130 MHz core (same volts, 1.093) did about 390-400W...
I am so happy my card can finally stretch its legs with this Gigabyte Aorus Master (R1.0 2x8-pin) BIOS somehow not being nearly as power limited as the stock Gaming OC / Eagle / Vision version. 2070 MHz core and 10900 MHz memory is great on 2x8-pin without really any throttling. Temps are great: 55C core, 70C hotspot and 72C VRAM junction, still on the EVGA FTW3 Hybrid block. I did place an order for a Barrow 3080/3090 Gaming OC block, 3-5 days. Then all I need is a proper pump and rad and I can go full-cover finally.


----------



## Astral85

Imprezzion said:


> I mean, it's less power hungry then a 2080 Ti at least. So far. This card does 335-345w @ 2070Mhz core and my 2080 Ti @ 2130Mhz core (same volts, 1.093) did about 390-400w...
> I am so happy my card can finally stretch it's legs with this Gigabyte Aorus Master (R1.0 2x8 pin) BIOS somehow not being nearly as much power limited as the stock Gaming OC / Eagle / Vision version. 2070Mhz core and 10900Mhz memory is great on 2x8 pin without really any throttling. Temps are great, 55c core, 70c hotspot and 72c VRAM Junction. Still on the EVGA FTW3 Hybrid block. I did place an order for a Barrow 3080/3090 Gaming OC block 3-5 days. Then all I need is a proper pump and rad and I can full-cover it finally.


Maybe it's the 3x8 pin cards like mine that pull power. I just mean that now I'm playing a lot of games with RT maxed out my 3080 is pulling more power on average than my 2080 Ti ever did. The 3080 is a monster of a card.

All the best with your full cover waterblock. I have the EK Quantum on mine. GPU runs at 40-46C max, VRAM junction around 57C. Really impressed with EK's block.


----------



## Imprezzion

Astral85 said:


> Maybe it's the 3x8 pin cards like mine that pull power. I just mean that now I'm playing a lot of games with RT maxed out my 3080 is pulling more power on average than my 2080 Ti ever did. The 3080 is a monster of a card.
> 
> All the best with your full cover waterblock. I have the EK Quantum on mine. GPU runs at 40-46C max, VRAM junction around 57C. Really impressed with EK's block.


Obviously I'd prefer EK or any other A-tier brand over Chinesium blocks, but I have to compromise: only Alphacool makes a block for the Gigabyte Gaming / Eagle / Vision PCB, and those are SO expensive I just can't justify one over the price of a Barrow block. I can get those locally as well; Bykski unfortunately not. Now, I'm getting a 420 + 280 rad setup, HWLabs 420GTS XFlow + 280GTS XFlow, and a Bitspower Summit M limited edition CPU block, but I still have to decide on the tubing, fittings and pump+res.


----------



## D13mass

Astral85 said:


> One thing I've noticed since getting my RTX 3080 is it's a real power hog. 😮


I noticed that it is about 80% more powerful than my old 1080 Ti.








And I set up 2 profiles in MSI AB:
1. For mining: 750mV, 1650/21000
2. For gaming: 975mV, 2070/22000
The first one is used almost 99% of the time, even in the background while I'm working, and gives me $250 per month; the second only once I've finished work, stopped mining, and started gaming.

It's the first time ever I've spent an hour finding two stable settings.
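For anyone weighing a low-power mining profile like that, a quick net-profit check is worth doing. Only the $250/month revenue comes from the post; the ~230W wall draw at 750mV and the $0.15/kWh electricity price below are assumptions, so plug in your own numbers:

```python
# Back-of-the-envelope mining profit check for a low-power profile.
# Revenue is from the post; wall draw and electricity price are assumed.

HOURS_PER_MONTH = 24 * 30

card_watts = 230.0      # assumed draw at 750 mV, 1650 MHz
price_per_kwh = 0.15    # assumed electricity price in $/kWh
revenue = 250.0         # $/month, from the post

energy_kwh = card_watts / 1000 * HOURS_PER_MONTH
cost = energy_kwh * price_per_kwh
print(f"Electricity: {energy_kwh:.0f} kWh ~ ${cost:.2f}; net ~ ${revenue - cost:.2f}/month")
```

At those assumed rates the electricity bill eats only about $25 of the $250, which is exactly why heavily undervolted profiles are popular for background mining.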


----------



## MikeS3000

Imprezzion said:


> I mean, it's less power hungry then a 2080 Ti at least. So far. This card does 335-345w @ 2070Mhz core and my 2080 Ti @ 2130Mhz core (same volts, 1.093) did about 390-400w...
> I am so happy my card can finally stretch it's legs with this Gigabyte Aorus Master (R1.0 2x8 pin) BIOS somehow not being nearly as much power limited as the stock Gaming OC / Eagle / Vision version. 2070Mhz core and 10900Mhz memory is great on 2x8 pin without really any throttling. Temps are great, 55c core, 70c hotspot and 72c VRAM Junction. Still on the EVGA FTW3 Hybrid block. I did place an order for a Barrow 3080/3090 Gaming OC block 3-5 days. Then all I need is a proper pump and rad and I can full-cover it finally.


Which Bios version did you use? I have a 3080 Gaming OC and would like to try this BIOS.


----------



## Astral85

Imprezzion said:


> Obviously I prefer a EK or any other A tier brand over Chinesium blocks but I have to, only Alphacool makes a block for the Gigabyte Gaming / Eagle / Vision PCB and those are SO expensive I just can't justify it over the price for a Barrow block. I can get those locally as well. Byksky not unfortunately.. Now, i'm getting a 420 + 280 setup rad wise, HWLabs 420GTS XFlow + 280GTS XFlow, a Bitspower Summit M limited edition CPU block, but I still have to decide on the tubing, fittings and pump+res.


I'm currently using the very popular EK ZMT matte black tubing, it's very durable and looks nice. If you decide to use that I recommend thoroughly cleaning the insides of the tubing before using it. If you would like transparent tubing I can definitely recommend Mayhem's Ultra Clear tubing.


----------



## Imprezzion

MikeS3000 said:


> Which Bios version did you use? I have a 3080 Gaming OC and would like to try this BIOS.











Gigabyte RTX 3080 VBIOS: 10 GB GDDR6X, 1440 MHz GPU, 1188 MHz Memory (www.techpowerup.com)


Mayhems Ultra Clear 10/16 (3/8-5/8) is like €13 per 3 meters, so that's very reasonable, and I want clear. I'm not going to run any colored fluid, but the RGB will shine through it. Fittings are going to be EK Quantum Torque Nickel, just because I can get those in the right size from a guy who ordered way too many of them, for half off BNIB.

Fluid, no idea.. I know EK's is bad; maybe Mayhems fluid or the new Corsair fluid?


----------



## MikeS3000

Imprezzion said:


> Gigabyte RTX 3080 VBIOS: 10 GB GDDR6X, 1440 MHz GPU, 1188 MHz Memory (www.techpowerup.com)
> 
> Mayhems Ultra Clear 10/16 (3/8-5/8) is like €13 per 3 meters so that's very reasonable and I want clear. Not going to run any colored fluid but the RGB will shine through it so.. fittings are going to be EK Quantum Torque Nickel. Just because I can get those in the right size from a guy who ordered way too many of them for half off BNIB.
> 
> Fluid, no idea.. I know EK is bad, maybe mayhems fluid or the new Corsair fluid?


Thanks for pointing me in the right direction for the vbios. I actually flashed a slightly older f31 Master bios with rebar enabled (I don't think the one you linked was a rebar bios). I didn't notice any difference unfortunately. To be fair I dialed back the clocks -45 to match the stock Gaming OC BIOS. The Port Royal scores were nearly identical. I could not overclock it any higher either on core or memory in Port Royal so it didn't unlock any magic on my card. Did you test clocks with matched boost clocks on each bios to confirm improvements or just stock bios vs. stock bios?


----------



## Imprezzion

MikeS3000 said:


> Thanks for pointing me in the right direction for the vbios. I actually flashed a slightly older f31 Master bios with rebar enabled (I don't think the one you linked was a rebar bios). I didn't notice any difference unfortunately. To be fair I dialed back the clocks -45 to match the stock Gaming OC BIOS. The Port Royal scores were nearly identical. I could not overclock it any higher either on core or memory in Port Royal so it didn't unlock any magic on my card. Did you test clocks with matched boost clocks on each bios to confirm improvements or just stock bios vs. stock bios?


Clocks on the stock BIOS, all on Auto, were 1980 MHz with power throttling in every game. On the Master BIOS, 2025 MHz with no throttling at all. Effective clocks are also miles higher, and it can sustain 1.062V now in every game, where the original BIOS only manages about 0.950V at best.

I dunno. I am not using the card's fans or LEDs, as it has a Hybrid kit on it that isn't connected to the card's fan controller. If the BIOS takes those into the power budget, that might explain it.

EDIT: I didn't trust the results myself either. Did some flashing back and forth between the stock Gaming OC, EVGA XC3 Ultra, and Aorus Master BIOS. Even though reported clocks (and effective clocks) are way higher on the Aorus Master BIOS, the performance just isn't there. In Watch Dogs Legion and Division 2 (both games with easily repeatable built-in benchmarks), the FPS and score are identical between all three of them, even with the Master reporting much higher clocks and no power throttling. Something is weird here..


----------



## chibi

Is the Strix the best 3080 model? A 450W default BIOS is the max for the 3080, right? No more XOC?
Water cooling and gaming only. No mining.


----------



## Imprezzion

chibi said:


> Best 3080 is Strix model? 450W default bios is the max for 3080 right? No more XOC?
> Water cooling and gaming only. No mining.


Either Strix or FTW3. The Gaming Z is good as well. But at this point I'd rather have a "bad" 3080 Ti than a "good" 3080 for basically the same price lol.


----------



## DrWaffles

dk10438 said:


> just started on OC'ing my 3080 FTW. Trying to get 19000 on the graphics score but the highest I can get is 18975. So close....


Flash the 3080 MSI Suprim X BIOS on it.
I managed to get 20k with that on my FTW3 on a clean OS.


----------



## DrWaffles

chibi said:


> Best 3080 is Strix model? 450W default bios is the max for 3080 right? No more XOC?
> Water cooling and gaming only. No mining.


An XOC BIOS for that card does exist, but it nukes your warranty. Not sure it's available to the public though..


----------



## Imprezzion

Well, I did not expect that.. all my "bad" overclocking headroom explained and fixed by one simple thing.

I noticed that ever since I modded the EVGA Hybrid block onto my card, the hotspot was way off, around 20C above core temps, but it wasn't too high so I never bothered to remount it...

Now that I have a Bykski full cover block due to arrive in 4 days, I started modding my case for the rads and such, and decided to redo ALL the cable and RGB/fan cable management. The GPU had to come out for this anyway, so I did a bit of investigating. Pulled the block off and yes, 1/5th of the core was making no contact at all...

I repasted it, bolted it back on, and threw it in my case again with the rad and fan just sitting on top of it till the rest shows up.

The core is 6-8C cooler and the hotspot is now within 10C of the core temp.

And lo and behold, I can run WAY higher overclocks now lol.. before, anything over 2040 would crash no matter the voltage. I've been playing Watch Dogs Legion with RTX Ultra and everything cranked the whole evening at around 2115 MHz @ 1.087V, and it remained perfectly stable. I even threw in some DLSS (it lowers power consumption a bit so the card doesn't hit the throttling point) and went up to 2130 @ 1.100V, and even that seems to remain stable so far..

I can't wait to have my full cover on it with even lower temps and see what this beast can really do lol.


----------



## mattxx88

Hi guys, just a question for the owners of the 3080 FE:
for repadding, is it correct to use 2mm pads, as you read on the net?


----------



## Pro4TLZZ

mattxx88 said:


> Hi guys, just a question to the owners of 3080fe
> for repadding is correct to use 2mm pads as you read on the net?


I've read that 2mm Gelid Extreme pads on the front and 3mm Gelid Extreme on the back work.

I have my 3080 FE coming today as well and am going to mod the pads too, so I can test it.

https://www.reddit.com/r/gpumining/comments/livilz/_/gn5fnp4


----------



## Imprezzion

My Bykski block just showed up unexpectedly! It wasn't supposed to arrive for at least 5 days but it did lol. All the pads for that are the same thickness. Just going to try the stock pads first and see what happens temp wise.


----------



## MikeS3000

mattxx88 said:


> Hi guys, just a question to the owners of 3080fe
> for repadding is correct to use 2mm pads as you read on the net?


----------



## fray_bentos

MikeS3000 said:


>


What I learned from this video is that changing pads is 100% not worth it, due to the risk of damage, warranty invalidation, and the challenge of getting a new card should anything bad happen.


----------



## Nizzen

fray_bentos said:


> What I learned from this video is that changing pads is 100% not worth it due to risk of damage, warranty invalidation and challenge of getting a new card should anything bad happen.


LOL.
It's 100% worth it with good-quality pads like Gelid Extreme.


----------



## kertsz

Hi, the lowest speed I can set my GPU fan to is 30%. What do you recommend I do to be able to lower that to 20% or 15%?


----------



## mouacyk

Imprezzion said:


> My Bykski block just showed up unexpectedly! It wasn't supposed to arrive for at least 5 days but it did lol. All the pads for that are the same thickness. Just going to try the stock pads first and see what happens temp wise.


Check the torque on the standoffs on this block. Several of mine were loose and I had to tighten them after discovering abnormally high temps, which was up to 57C initially. Afterwards, I barely exceed 45C with higher overclocks above 1v and 2GHz.


----------



## Imprezzion

mouacyk said:


> Check the torque on the standoffs on this block. Several of mine were loose and I had to tighten them after discovering abnormally high temps, which was up to 57C initially. Afterwards, I barely exceed 45C with higher overclocks above 1v and 2GHz.


Yeah thanks for the tip! Saves me a drain lol.

Sitting at 42c core 53c hotspot 72c VRAM in cyberpunk with RTX psycho bouncing off the power limit around 2115Mhz ~ 1.087v.


----------



## mouacyk

Imprezzion said:


> Yeah thanks for the tip! Saves me a drain lol.
> 
> Sitting at 42c core 53c hotspot 72c VRAM in cyberpunk with RTX psycho bouncing off the power limit around 2115Mhz ~ 1.087v.


Phenomenal results, gratz! Did you have any loose standoffs?


----------



## fray_bentos

kertsz said:


> Hi, the lowest speed I can set my GPU fan to is 30%. What do you recommend me to do to be able to lower that speed to 20% or 15%?


If you use the default fan curve, the fans should (eventually) turn off completely when idle. I'm pretty sure the use of any custom curve prevents the fans from turning off or dropping below 30%. You can also try FanControl; I use it myself to control other fans, but I still think zero-fan mode is disabled when any custom curve is used. It does have a nice UI that lets you disable the custom GPU curve in a click or two to let the fans turn off. GitHub - Rem0o/FanControl.Releases: This is the release repository for Fan Control, a highly customizable fan controlling software for Windows.
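For reference, a custom fan curve of the kind these tools apply is essentially linear interpolation between (temperature, duty%) points, clamped to the card's minimum duty, which is why many cards won't drop below ~30% once a custom curve is active. A minimal sketch; the curve points and the 30% floor here are illustrative, not taken from any specific card:

```python
# A custom GPU fan curve: linear interpolation between (degC, duty%)
# points, clamped to a firmware minimum duty. Points and the 30% floor
# are illustrative values.
import bisect

CURVE = [(40, 0), (55, 35), (70, 55), (85, 100)]  # (degC, duty %)
MIN_DUTY = 30  # minimum duty many cards enforce with a curve active

def duty_for(temp_c: float) -> float:
    temps = [t for t, _ in CURVE]
    if temp_c <= temps[0]:
        raw = CURVE[0][1]
    elif temp_c >= temps[-1]:
        raw = CURVE[-1][1]
    else:
        i = bisect.bisect_right(temps, temp_c)
        (t0, d0), (t1, d1) = CURVE[i - 1], CURVE[i]
        raw = d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    # zero-fan idle only works when the curve reaches 0 *and* no floor applies
    return raw if raw == 0 else max(raw, MIN_DUTY)

print(duty_for(40), duty_for(62.5), duty_for(90))  # → 0 45.0 100
```

The clamp on the last line is the whole story of the 30% minimum: any non-zero interpolated duty gets pushed up to the floor.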


----------



## Imprezzion

mouacyk said:


> Phenomenal results, gratz! Did you have any loose standoffs?


Yeah. Not as bad as I read in some Reddit posts and in yours, but 2 of the 4 had play; the other 2 were as tight as can be. I just snugged them up before mounting.

I did absolutely hate the way the thermal pads are done. Just some random strips of thermal pads and no manual to speak of; more like "here, go cut them yourself"..

I also don't really get the point of the small transparent washers. Where do you even use them? No YouTube guide or whatever ever shows anyone using the washers lol. I didn't either, and I can't see any way they are useful. All the points where you screw anything down, block or backplate, have screw pads on the PCB, and the stock cooler doesn't use washers either.

The RGB is absolutely beautiful tho.

Temps and clocks while sitting in CP2077 at absolute max load: Bykski block + Nemesis GTX 420 + 240 goes brrrrr. 44c.. at full voltage and power limits.. lol.


----------



## mouacyk

Imprezzion said:


> Yeah. Not as bad as I read in some Reddit posts and yours but 2 of the 4 had room the other 2 were tight as can be. Just snugged them up before mounting.
> 
> I did absolutely hate the way the thermal pads are done. Just some random strips of thermal pads and no manual to speak of more like here, go cut them yourself..
> 
> I also don't really get the whole goal of the small transparent washers.. where do you even use them.. no YouTube guide or whatever ever shows anyone using the washers lol. I didn't either. And I can't see of any way they are useful. All the points where you screw anything down, block or backplate, have screw pads on the PCB and the stock cooler doesn't use washers either.
> 
> The RGB is absolutely beautiful tho.
> 
> Temps and clocks while sitting in CP2077 at absolutely max load: Bykski block + Nemesis GTX 420 + 240 goes brrrrr. 44c.. at full voltage and power limits.. lol.


I ended up using the washers, figuring they should help increase mount pressure. So far so good, and I'm on external GTX 360 + internal slim XSPC 120. I also followed the guide here [User Review] - Gigabyte RTX 3080 Bykski | Hardwareluxx and installed a few extra thermal pads, including making a small gasket behind the die + copper heatsink on backplate. Must feel nice to be off the hybrid kit, where your VRAM + VRM both have proper active cooling now. Looking forward to your BIOS adventures again


----------



## kertsz

fray_bentos said:


> If you use the default fan curve then the fans should (eventually) turn off completely when idle. I'm pretty sure that use of any custom curve prevents fans from turning off or below 30%. You can also try FanControl; I use it myself to control other fans, but I still think zero fan mode is disabled when any custom curve is used. It does have a nice UI that allows you to easily disable the custom GPU curve in a click or two to let the fans turn off. GitHub - Rem0o/FanControl.Releases: This is the release repository for Fan Control, a highly customizable fan controlling software for Windows.


I've been testing, and FanControl doesn't allow me to lower the fan below 30% either. From what I understand it is a limitation of the BIOS, but I do not know if I can bypass that limitation with some software.


----------



## Imprezzion

kertsz said:


> I've been testing, and FanControl doesn't allow me to lower the fan below 30% either. From what I understand it is a limitation of the BIOS, but I do not know if I can bypass that limitation with some software.


Yup, the BIOS has hardcoded PWM min and max. You could see if a BIOS from a different model or vendor allows a lower fan speed? Although in all my BIOS testing I've yet to come across one that allows lower than 30% lol.


----------



## jjjc_93

Hey guys, dang it has been a long time since I've posted on ocn. 

I converted my 3080 FTW3 (LHR) with the hybrid kit today as the fan noise was annoying me; even with a mild undervolt it was pretty loud, and I wanted to play with the XOC BIOS.

All went well and core temps dropped significantly from 70c with UV to max 50c so far in 3dmark with the 450w bios. Noise is significantly improved too which was the main thing. Will do some more playing around soon and get some 3dmark runs up to see where the card sits.

One thing is I cannot seem to update the BIOS to the Hybrid version, anybody have experience with this? When trying to update with the XOC BIOS from this EVGA thread it just states the latest firmware is installed and aborts the process. The regular FTW3 LHR XOC BIOS installed no problem. I guess it's not a big deal either way as I control the fans externally.


----------



## Imprezzion

jjjc_93 said:


> Hey guys, dang it has been a long time since I've posted on ocn.
> 
> I converted my 3080 ftw3 (lhr)with the hybrid kit today as the fan noise was annoying me, even with a mild undervolt is was pretty loud and I wanted to play with the XOC bios.
> 
> All went well and core temps dropped significantly from 70c with UV to max 50c so far in 3dmark with the 450w bios. Noise is significantly improved too which was the main thing. Will do some more playing around soon and get some 3dmark runs up to see where the card sits.
> 
> One thing is I can not seem to update the bios to the Hybrid version, anybody have experience with this? When trying to update with the xoc bios from this evga thread it just states the latest firmware is installed and aborts the process. The regular ftw3 lhr xoc bios installed no problem. I guess it's not a big deal either way as I control the fans externally.
> 


Avoid using tools like that updater, just manually flash with nvflash. You can force it to flash any version, downgrades included.
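For reference, the manual flow usually looks like the sketch below, written as a dry-run command builder. The flag names (`--save`, `--protectoff`, `-6`) are the commonly documented nvflash64 ones, not taken from this thread; verify them against your own build's `--help` output before flashing anything.

```python
# Hedged sketch of a manual nvflash flow: build the command sequence without
# running anything, so you can review it first. Flag names are the commonly
# documented nvflash64 ones; check your version's --help before use.

def nvflash_plan(rom_path: str, exe: str = "nvflash64") -> list[list[str]]:
    """Return the command sequence for a forced flash, backup first."""
    return [
        [exe, "--save", "backup.rom"],   # always back up the current BIOS first
        [exe, "--protectoff"],           # disable the EEPROM write protect
        [exe, "-6", rom_path],           # -6 overrides ID mismatch checks (forced flash)
    ]

if __name__ == "__main__":
    for cmd in nvflash_plan("aorus_master.rom"):
        print(" ".join(cmd))
```

The point of `-6` is exactly the "force it to flash any version" behaviour mentioned above, downgrades included, so double-check the ROM really matches your board before running the last step.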

As far as my BIOS adventures go:
Water-cooling helps because the card isn't drawing any power for fans or LEDs from the board anymore, so I have a little more power limit headroom now, and the Aorus Master (2x8-pin version) BIOS does wonders on my Gaming OC. Watch Dogs Legion at 1080p max RT Ultra runs at a nice and cool 44c core, 55c hotspot, 76c VRAM (stock Bykski pads) at 2115-2100 core, 1.100-1.087v, with +1400 memory (10900 "MHz"), and does not power throttle. Cyberpunk does throttle very lightly when running 1080p max RT Psycho; it drops to 2055-2070 @ 1.050-1.062v from time to time, but it's miles ahead of the stock Gaming OC BIOS and cooler, which barely even held 1965 @ 0.987v in Cyberpunk, it throttled that badly.


----------



## jjjc_93

Imprezzion said:


> Avoid using tools like that updater, just manually flash with nvflash. You can force it to flash any version, downgrades included.


Agreed, however I can't find the ROMs listed, only the updater .exe.


----------



## Imprezzion

jjjc_93 said:


> Agree however I can't find the roms listed, only the updater exe.


Have you tried checking the unverified section of TechPowerUp GPU BIOS database? There's also a few ways to "unpack" an .exe (sometimes even 7z can do it) to extract the file.


----------



## TheBoom

Hi guys,

Got an opportunity to sell my 3070 FHR for a good price and was looking at getting a 3080 LHR instead. 

Would you guys say the Galax SG LHR is a good card? Or should I pay about $120 extra for a MSI Gaming Z Trio LHR? I'm not a fan of MSI to be honest, never had luck with any of their products I owned.

Galax is 320w max whereas the MSI is 380w.

I was thinking the Galax has a really low PL, maybe I could try flashing a diff bios on it.

What do you guys think?


----------



## Imprezzion

TheBoom said:


> Hi guys,
> 
> Got an opportunity to sell my 3070 FHR for a good price and was looking at getting a 3080 LHR instead.
> 
> Would you guys say the Galax SG LHR is a good card? Or should I pay about $120 extra for a MSI Gaming Z Trio LHR? I'm not a fan of MSI to be honest, never had luck with any of their products I owned.
> 
> Galax is 320w max whereas the MSI is 380w.
> 
> I was thinking the Galax has a really low PL, maybe I could try flashing a diff bios on it.
> 
> What do you guys think?


When given a choice in terms of availability, always always always go for a 3x8-pin model over a 2x8-pin. The Gaming Z is perfectly capable of 450w with the right BIOS, whereas the Galax will top out at 350w.

Doesn't have to be the Gaming Z; Gigabyte makes several 3x8-pin models (Aorus Extreme for example), EVGA has the FTW3, and many other brands like Palit, Zotac, Gainward and so forth make 3x8-pin models with 380-420-450w capabilities.


----------



## fray_bentos

TheBoom said:


> Hi guys,
> 
> Got an opportunity to sell my 3070 FHR for a good price and was looking at getting a 3080 LHR instead.
> 
> Would you guys say the Galax SG LHR is a good card? Or should I pay about $120 extra for a MSI Gaming Z Trio LHR? I'm not a fan of MSI to be honest, never had luck with any of their products I owned.
> 
> Galax is 320w max whereas the MSI is 380w.
> 
> I was thinking the Galax has a really low PL, maybe I could try flashing a diff bios on it.
> 
> What do you guys think?


Unless you are either a) water cooling your GPU or b) deaf, going to those high power limits is not worth it; the fan noise will be awful, and depending on your fan configuration, your CPU temps might even suffer. Personally, I take a 5% performance hit for noise going from loud (heard through headphones) to inaudible with undervolting, i.e. less power, not more power. Based on such advice, buy the cheapest 3080 you can find (unless deaf or water cooling). You get about 5% performance change for a 100 W change in power consumption at the limits of these cards; not worth it.
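To put the rough arithmetic behind that advice into code: a toy calculation of the efficiency trade-off. The ~5% per 100 W figure is the forum estimate above, not a measurement, and the FPS and wattage inputs below are made-up examples.

```python
# Illustrative sketch of the diminishing-returns point above. The 5%-per-100W
# scaling near the power limit is a rough forum figure, not measured data.

def perf_gain_pct(extra_watts: float, pct_per_100w: float = 5.0) -> float:
    """Approximate % performance gained from extra board power near the limit."""
    return extra_watts / 100.0 * pct_per_100w

def perf_per_watt_change(base_fps: float, base_watts: float,
                         extra_watts: float) -> tuple:
    """Compare FPS-per-watt before and after raising the power limit."""
    new_fps = base_fps * (1 + perf_gain_pct(extra_watts) / 100.0)
    return base_fps / base_watts, new_fps / (base_watts + extra_watts)

if __name__ == "__main__":
    before, after = perf_per_watt_change(base_fps=100.0, base_watts=320.0,
                                         extra_watts=100.0)
    print(f"+100 W buys +{perf_gain_pct(100):.1f}% perf; "
          f"efficiency goes {before:.3f} -> {after:.3f} FPS/W")
```

Under these assumed numbers, 420 W delivers 105 FPS where 320 W delivered 100, so efficiency drops by roughly a fifth for a barely visible gain, which is the whole argument for undervolting instead.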


----------



## blurp

Completely agree with the above post. I undervolt a 3080 FTW at 1875 @ 0.875v. Silent, efficient, yet powerful for my needs. Never exceeds 320W, and rarely above 62C.


----------



## TheBoom

Imprezzion said:


> When given a choice in terms of availability always always always go for a 3x8 pin model over a 2x8 pin. The Gaming-Z is perfectly capable of 450w with the right bios where ad the Galax will top out at 350w.
> 
> Doesn't have to be Gaming-Z, Gigabyte makes several 3x8 pin models (Aorus Extreme for example), EVGA has the FTW3, many other brands like Palit, Zotac, Gainward asf make 3x8 pin models with 380-420-450w capabilities.





fray_bentos said:


> Unless you are either a) water cooling your GPU or b) deaf, going to those high power limits is not worth it; the fan noise will be awful, and depending on your fan configuration, your CPU temps might even suffer. Personally, I take a 5% performance hit for noise going from loud (heard through headphones) to inaudible with undervolting, i.e. less power, not more power. Based on such advice, buy the cheapest 3080 you can find (unless deaf or water cooling). You get about 5% performance change for a 100 W change in power consumption at the limits of these cards; not worth it.





blurp said:


> Completely agree with above post. I undervolt a 3080 FTW 1875 @ 0.875mv. Silent, efficient yet powerful for my need. Never exceed 320W. And rarely above 62C


Thanks for the input. These are the only two models available right now where I am (a Zotac Trinity OC as well, but I prefer the Galax over it). Based on what I've heard, the power limits don't increase potential as much as they used to with previous-gen cards. 100 watts for 3-5% does seem excessive.

Think I'll go for the Galax based on your opinions and mine. Thanks.


----------



## DrWaffles

Been doing some testing on the Strix OC.

Might be old news, but it looks like the power limits get tripped if you're drawing above ~140-145w from any of the PCIe connectors, even if you're well below the factory-supported 450w limit. I can see I'm limited to around 400w in this particular title.

It seems load balancing is an issue on all 3080 cards, not just EVGA's FTW3 cards (the PCIe slot was only around 40w)... Even with these high-PL BIOSes, the card's shunt resistors are checking that everything else isn't going above spec too..

XOC is the bottom two graphs, OEM BIOS is the top two graphs.
I hadn't planned to daily it, but I might, and just use AB or nvidia-smi to set it to 450/470w.. I don't particularly like the frequency/core voltage dancing around all over the place.

Interestingly, ABE004 doesn't seem to be able to read the XOC BIOS; I was curious what was actually changed.


Anywho, the conclusion I'm drawing is that it's not just about the total board power limit of the BIOS.. Pick a BIOS that breaks NVIDIA's current monitoring..
Probably explains why the Suprim X vBIOS worked so well on my old FTW3: half the shunts reported 9.9v instead of 12v
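The per-rail behaviour described above can be sketched as a toy model: the card throttles when *any* monitored input maxes out, not only when total board power does. The ~145 W per 8-pin trip point and 450 W board limit are this post's observations, not a published spec.

```python
# Toy model of per-rail power limiting: throttling triggers if either the total
# board power OR any single 8-pin connector hits its limit. Thresholds are the
# approximate figures observed in the post above, not official values.

PER_8PIN_LIMIT_W = 145.0   # approximate per-connector trip point observed
BOARD_LIMIT_W = 450.0      # factory board power limit on this BIOS

def power_limited(pcie_slot_w: float, connectors_w: list[float]) -> bool:
    """True if the total board power or any single 8-pin rail is maxed out."""
    total = pcie_slot_w + sum(connectors_w)
    return total >= BOARD_LIMIT_W or any(w >= PER_8PIN_LIMIT_W for w in connectors_w)

# ~400 W total, well under 450 W, but one connector at its trip point -> limited
print(power_limited(40.0, [145.0, 110.0, 105.0]))   # True
# Same total, evenly balanced across the three connectors -> no limit hit
print(power_limited(40.0, [120.0, 120.0, 120.0]))   # False
```

This is why a poorly balanced card can report "power limit" at ~400 W on a 450 W BIOS, exactly as in the graphs described.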


----------



## Imprezzion

DrWaffles said:


> Been doing some testing on the Strix OC.
> 
> Might be old news, but looks like the power limits get tripped if you're drawing above ~140-145w from any of the PCIE connectors, even if you're well below the factory supported 450w limit.. Can see i'm limited to around 400w in this particular title.
> 
> Seems to be that load balancing seems to be an issue on all 3080 cards, not just EVGA's FTW3 cards
> (PCIE slot was only around 40w)... Even with these high PL BIOS's the cards 40,000 shunt resistors are checking everything else isn't going above spec too..
> 
> XOC is the bottom two graphs, OEM BIOS is the top 2 graphs.
> I hadn't planned to daily it, but I might and just use AB or Nvidia-smi to set it to 450/470w.. Don't particularly like the frequency/core voltage dancing around all over the place.
> 
> Interestingly, ABE004 doesn't seem to be able to read the XOC bios, was curious what was actually changed.
> 
> 
> Anywho, conclusion i'm drawing is it's not just about total board power limit of the bios.. Pick a BIOS that breaks nvidias current monitoring..
> Probably explains why the Suprim X Vbios worked so well on my old FTW3.. Half the shunts reported 9.9v instead of 12v
> 


Would this explain why the Aorus Master (2x8-pin revision) BIOS works so much better on a Gigabyte Gaming OC? With the Gaming OC BIOS, which is supposed to be 370w, it throttles around 335-340w; the best it can manage is like 1950-1980 @ 0.962-0.987v. In the same benchmarks and games, with the also-370w Aorus Master BIOS, it pulls 2050-2085 @ 1.050-1.087v at 340-350w, and doesn't throttle as aggressively. I can, in most games, hold 2100+ @ 1.087-1.100v now (1080p; at 4K the story is quite different). Card is full-cover water-cooled: core 47c max, 56c hotspot, 72c VRAM.


----------



## Astral85

Imprezzion said:


> Yeah. Not as bad as I read in some Reddit posts and yours but 2 of the 4 had room the other 2 were tight as can be. Just snugged them up before mounting.
> 
> I did absolutely hate the way the thermal pads are done. Just some random strips of thermal pads and no manual to speak of more like here, go cut them yourself..
> 
> I also don't really get the whole goal of the small transparent washers.. where do you even use them.. no YouTube guide or whatever ever shows anyone using the washers lol. I didn't either. And I can't see of any way they are useful. All the points where you screw anything down, block or backplate, have screw pads on the PCB and the stock cooler doesn't use washers either.
> 
> The RGB is absolutely beautiful tho.
> 
> Temps and clocks while sitting in CP2077 at absolutely max load: Bykski block + Nemesis GTX 420 + 240 goes brrrrr. 44c.. at full voltage and power limits.. lol.


What are you fitting a Nemesis GTX 420 into?


----------



## Astral85

DrWaffles said:


> Been doing some testing on the Strix OC.
> 
> Might be old news, but looks like the power limits get tripped if you're drawing above ~140-145w from any of the PCIE connectors, even if you're well below the factory supported 450w limit.. Can see i'm limited to around 400w in this particular title.
> 
> Seems to be that load balancing seems to be an issue on all 3080 cards, not just EVGA's FTW3 cards
> (PCIE slot was only around 40w)... Even with these high PL BIOS's the cards 40,000 shunt resistors are checking everything else isn't going above spec too..
> 
> XOC is the bottom two graphs, OEM BIOS is the top 2 graphs.
> I hadn't planned to daily it, but I might and just use AB or Nvidia-smi to set it to 450/470w.. Don't particularly like the frequency/core voltage dancing around all over the place.
> 
> Interestingly, ABE004 doesn't seem to be able to read the XOC bios, was curious what was actually changed.
> 
> 
> Anywho, conclusion i'm drawing is it's not just about total board power limit of the bios.. Pick a BIOS that breaks nvidias current monitoring..
> Probably explains why the Suprim X Vbios worked so well on my old FTW3.. Half the shunts reported 9.9v instead of 12v
> 


Where does Generic Log Viewer come from in your screenshot?


----------



## DrWaffles

Astral85 said:


> Where does Generic Log Viewer come from in your screenshot?


LogViewer for GPU-Z available!

LogViewer for GPU-Z is available! I like the tool GPU-Z, especially the logging functionality! It's the only tool I know which logs VRM temperatures reliably. But analyzing the logs can be tricky. Of course, you can open a log-file in Excel, but then you have to do a lot of steps...

www.techpowerup.com


----------



## Imprezzion

Astral85 said:


> What are you fitting a Nemesis GTX 420 into?


It's in the front of my Phanteks Enthoo Evolv X. Width-wise it technically doesn't fit into the PSU bay slot, but a bit of massaging of the motherboard tray made it fit. Height is fine; more than fine. Thickness-wise it also fits push-pull if you fit the front fans between the case and the front panel (yes, that fits in an Evolv X, you just have to mod / cut the front panel for airflow if you do that).

3080 Gaming OC with Aorus Master BIOS, the Bykski block, passive backplate with all stock thermal pads and Prolimatech PK-3:
Far Cry 5 @ 4K HDR on all maxed settings.
Bouncing off the power limiter @ 350-355w around 2040-2070 core @ 1.043-1.062v (it's set to 2130 @ 1.100) with VRAM 10900Mhz (+1400).
Max core temp: 46.2 
Max hotspot temp: 55.9
Max VRAM Junction: 68 

If I play something a little lighter or at a lower res (far cry is upsampled res scale 2.0x, display is 1080p) like Watch Dogs Legion or Cyberpunk on native with Quality DLSS it doesn't hit limiter at all and just lets me run full on 2130 @ 1.100v. Core temps usually in the 43-44c range. 

And this is all with utter silence. Rad fans did not go over 820RPM. Water in the loop barely heats up if at all. 

I should still put better pads on the VRAM maybe, because 68 with a full-cover block and a passive backplate with pads is pretty poor, but these are the stock Bykski pads. I should've put Gelids on it..


----------



## TheBoom

Anyone here with a Galax 3080 SG? Just got the card today and noticed core voltage slider is not available.

Yes, I have checked the box and it was working with my Gigabyte Gaming OC 3070 just fine.

Wonder if it’s hard locked with this card?


----------



## Astral85

Imprezzion said:


> It's in the front of my Phanteks Enthoo Evolv X. It technically doesn't fit width wise into the PSU bay slot but a bit of massaging of the motherboard tray made it fit. Height is fine. More then fine. It also thickness wise fits push-pull if you fit the front fans between the case and front panel (yes that fits in a Evolv X, you just have to mod / cut the front panel for airflow if you do that).
> 
> 3080 Gaming OC with Aorus Master BIOS, the Bykski block, passive backplate with all stock thermal pads and Prolimatech PK-3:
> Far Cry 5 @ 4K HDR on all maxed settings.
> Bouncing off the power limiter @ 350-355w around 2040-2070 core @ 1.043-1.062v (it's set to 2130 @ 1.100) with VRAM 10900Mhz (+1400).
> Max core temp: 46.2
> Max hotspot temp: 55.9
> Max VRAM Junction: 68
> 
> If I play something a little lighter or at a lower res (far cry is upsampled res scale 2.0x, display is 1080p) like Watch Dogs Legion or Cyberpunk on native with Quality DLSS it doesn't hit limiter at all and just lets me run full on 2130 @ 1.100v. Core temps usually in the 43-44c range.
> 
> And this is all with utter silence. Rad fans did not go over 820RPM. Water in the loop barely heats up if at all.
> 
> I should still put better pads on the VRAM maybe cause 68 with a full cover block and a passive backplate with pads is pretty poor but these are the stock Bykski pads. I should've put Gelid's on it..


I have the Phanteks P600S which is the same chassis as the Evolv X. I had a lot of trouble getting my rad and pump placements to work in this case. I was trying to work in the lower part of the case with the front rad but it drove me mad. I have the front rad ports facing down for easier draining. It's the reason I've decided not to get a 420mm rad which I would have to work with at the bottom of the case.

That is interesting. I am having a hard time dissipating the heat with my EK PE360 and Corsair XR5 280 rads and I run the fans in excess of 1000 RPM under load.


----------



## Imprezzion

I have it with the fittings facing up, with a 240 in the top to give the fittings some room. Keep in mind that a Nemesis GTX radiator shroud is technically too wide to fit in the PSU bay hole. There's a small screw holding the motherboard tray in at the start of the PSU bay hole; if you remove this screw it will move back JUST far enough to shove it in with a bit of force. If you go 420, make sure to get a 420 that is narrower than a GTX.

If I had to choose now I would've gotten 2 360's instead, but I already had a load of rather expensive 140mm fans so..

The pump/res is a disaster to place in this case tbh. I have mine standing on the PSU bay now with a 120mm fan mount bracket with the outer legs bent over to the inside. There's no real way to mount it on a fan or radiator, especially not a thick boi with push-pull.

For draining I have an EK Y-splitter on the pump outlet with an EK Quantum Torque drain facing towards the window side. If I wanna drain I can put a straight EK Quantum Torque fitting on it with a piece of hose.

Also, when using soft tubing, try to get some narrower tubing to make routing easier. I had to use 16/10 (5/8 OD, 3/8 ID) because the CPU block already had those fittings, but 16 OD is quite stiff and hard to route around the top rad.


----------



## TheBoom

So turns out I can only unlock core voltage with Galax’s own tuning software which tbh is one of the worst I’ve come across.

Either way it doesn’t matter since I’m power limited at even 0.875v. Undervolting the card seems to give me a max of 1905-1920 stable on the core at 0.868v without any memory oc.

Yet to replace thermal paste or flash other reference bioses on it. Will probably run it stock for a while.


----------



## KeepCalm.

Hello everyone, I am new here, but I went through ALL 233 pages :) I have an MSI Gaming X and wanted to share my Time Spy GPU score, which is 19787. Took a while to get there. The card is water-cooled with the Suprim BIOS; settings were 1018mV at 2100MHz, temps around 40 degrees (around 400ish watts used max). When is winter coming? Temps are too warm outside :)










So mainly I wanted to say thanks to everyone who inspired me or helped me to see numbers and facts and learn about trial and Errors, great to have such a Community!
Cheers


----------



## acoustic

2100 @ 1.018v is pretty good! Nice scores!


----------



## TheBoom

I always thought the memory chips have their own voltage that cannot be adjusted by any tool, but it seems I was wrong. I can get +1000 mem stable at 0.9v, but undervolted to 0.85v I get around +800-900 at most game stable.

Either way, 18017 Time Spy GPU score undervolted to 0.856v @ 1875 core, +900 mem. Not bad for the second cheapest 3080 on the market with a stock BIOS and paste.


----------



## TheBoom

Hands got itchy and I decided to try flashing some BIOSes on the card.

Since this is an LHR card I could only flash other LHR BIOSes without a PCIe sub-ID mismatch.

The XC3 FTW BIOS worked well, up to 400w max, which was a bit over the 2x8-pin limit.

Funny thing is I was still power limited. All I managed to get for that extra 80-90 watts was a whopping 15MHz more.


----------



## DrWaffles

TheBoom said:


> Hands got itchy and I decided to try flashing some bioses on the card.
> 
> Since this is a LHR card I could only flash other LHR bioses without a pcie sub id mismatch.
> 
> The XC3 FTW bios worked well, up to 400w max which was a bit over the 2x8 pin limit.
> 
> Funny thing is I was still power limited. All I managed to get for that extra 80-90 watts was a whopping 15mhz more.


Somebody correct me if I'm wrong, but...

The power limit doesn't give you an extra 15MHz.
Frequency is determined by silicon quality, temps, PCB design, and core voltage for the most part.

All a higher power limit does is stop you from hitting the power limit as soon (which drops your core voltage down, and subsequently the clocks too).

If you're limited to 2085mhz @ 1100mv @ 320w,
you'll still be limited to 2085mhz @ 1100mv @ 400w.

What will happen is you can maintain 1100mv at a point in your benchmark that would normally be throttled back to, say, 1034mv (and lower clocks as a result).
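The logic above can be sketched as a tiny model: the top clock comes from the V/F curve, and the power limit only decides how often you stay on it. Every number below (the V/F points and the quadratic power model) is made up for illustration, not measured from any card.

```python
# Sketch of the point above: the max clock is set by the V/F curve; a higher
# power limit only lets you sustain a higher point on that curve, it never
# raises the curve itself. All numbers are illustrative, not real card data.

VF_CURVE = {1.100: 2085, 1.034: 1995, 0.950: 1905}  # volts -> MHz, made up

def sustained_clock(demand_w: float, limit_w: float) -> int:
    """Pick the highest V/F point whose (toy) power draw fits under the limit."""
    for volts, mhz in sorted(VF_CURVE.items(), reverse=True):
        # toy power model: draw scales with voltage squared relative to peak
        draw = demand_w * (volts / 1.100) ** 2
        if draw <= limit_w:
            return mhz
    return min(VF_CURVE.values())

print(sustained_clock(demand_w=380, limit_w=320))  # throttles to a lower V/F bin
print(sustained_clock(demand_w=380, limit_w=400))  # holds the top 2085 MHz bin
```

Notice that raising the limit past the point where the top bin fits changes nothing: the V/F curve is the ceiling, which is exactly why TheBoom's extra 80-90 W only bought a handful of MHz.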


----------



## RobertoSampaio

Do you know if I can find an AIO for my Gigabyte RTX 3080 Gaming OC?


----------



## Imprezzion

RobertoSampaio said:


> Do you know if I can find a AIO for my Gibabyte RTX 3080 Gaming OC?


Yes you can. I have done it. The EVGA XC3 Hybrid fits almost fully; the FTW3 Hybrid fits the core and VRAM, but not the VRM and not the shroud. You have to handle VRM cooling yourself by either cutting up the stock cooler or using universal heatsinks.

TBH, not worth it. Just get the Bykski block from AliExpress (I recommend AegirX as a reseller on AliExpress) with a simple custom loop. The Hybrid is expensive af as well so..


----------



## dansi

Anyone here with an SS-1000XP PSU?

Does it work with a 3080?


----------



## fray_bentos

dansi said:


> any one with a SS-1000XP psu?
> 
> works with 3080?


Why wouldn't it?


----------



## dansi

fray_bentos said:


> Why wouldn't it?


kinda worried, as someone with a bigger and newer version had the problem:


Seasonic Platinum 1200W (100A on 12V) Crashes with RTX 3090

Got an interesting issue. I get a hard reset at game load with the Seasonic 1200W Platinum SS 1200XP3. Total wall load spikes to 500W at crash. I have a spare FPS Booster power supply so I hooked up one 8-pin to the FPS Booster and one to the Seasonic and it all worked. I know the RTX had...

www.overclock.net


----------



## fray_bentos

dansi said:


> kinda worried as someone with bigger and newer version had the problem
> 
> Seasonic Platinum 1200W (100A on 12V) Crashes with RTX 3090
> 
> Got an interesting issue. I get a hard reset at game load with the Seasonic 1200W Platinum SS 1200XP3. Total wall load spikes to 500W at crash. I have a spare FPS Booster Power supply so I hooked up one 8 pin the the FPS Booster and one to the Seasonic and it all worked. I know the RTX had...
> 
> www.overclock.net


Blimey, that's a ridiculous story for such a high power rated / platinum PSU! My 650 W gold rated EVGA is fine with a 10900KF @ 5.2 GHz OC and a 3080... Hope yours works OK.


----------



## RobertoSampaio

Does anyone have an Odyssey G9 monitor? What do you think?
Does the 3080 get about 100fps overall with it?


----------



## TheBoom

DrWaffles said:


> Somebody correct me if I'm wrong..But
> 
> The power limit doesn't give you an extra 15mhz.
> Frequency is determined by silicon quality, temps, PCB design, and core voltage for the most part.
> 
> All a higher power limit does is stop you hitting power limits as soon (Which drops your core voltage down and subsequently the clocks too)
> 
> If you're limited to 2085mhz @ 1100mv @ 320w
> You'll still be limited to 2085mhz @ 1100mv @ 400w
> 
> What will happen. Is you can maintain 1100mv at a point in your benchmark that normally would be throttled back to say 1034mv (And lower clocks as a result)


Oh, I should have added that I undervolted the card to the point where it wouldn't throttle due to the power limit. In both cases voltage was around 0.85-0.868v, slightly higher with the 400w BIOS, and the max stable increase was 15MHz. Now, we're talking game stable, not benchmark stable.

I'm quite certain it's probably a PCB design limit of some sort; hitting a power limit of 400w at 0.9v doesn't seem right.


----------



## Imprezzion

This is why I use the Aorus Master BIOS (2x8pin rev) on my Gaming OC. It has by far the highest voltage + clocks at the same power limit of any BIOS I tested.

On most BIOSes, like the stock Gaming OC / Vision / Eagle and the EVGA XC3 Ultra, TUF and such, it can only do about 0.925v @ 1950 core max before hitting power throttling. On the Aorus Master BIOS it does about 1.081v / 2115MHz at the same power limit, barely ever throttles, and when it does it only drops to about 1.043v / 2040MHz; it never drops below 2000MHz in any load or benchmark, which the stock or EVGA XC3 Ultra BIOS could never do.

Don't ask me why this happens, it makes no sense and it shouldn't as it's just another 370w 2x8pin BIOS with nothing special in it, but it works....


----------



## mouacyk

Imprezzion said:


> This is why I use the Aorus Master BIOS (2x8pin rev) on my Gaming OC. It has by far the highest voltage + clocks at the same power limit of any BIOS I tested.
> 
> On most BIOS like stock Gaming OC / Vision / Eagle and EVGA XC3 Ultra, TUF and such it can only do about 0.925v @ 1950 core max before hitting power throttling. On the Aorus Master BIOS it does about 1.081v 2115Mhz at the same power limits and barely ever throttles and if it does it's down to like 1.043v 2040Mhz ish and never drops below 2000Mhz in any load or benchmarks which the stock or EVGA XC3 Ultra BIOS could never do.
> 
> Don't ask me why this happens, it makes no sense and it shouldn't as it's just another 370w 2x8pin BIOS with nothing special in it, but it works....


I've tried out the Master BIOS too, but seeing it top out at 340W in some loads wasn't encouraging either. My thought was that this BIOS is deployed on better-binned chips, because of the higher boost. It could be why, in general, you get better boost for the same power envelope. If I'm not on it already, I'd like to try it again for some winter benching.

Have you tried the Waterforce Extreme BIOS, which I'm using ATM? I had hoped it would be unlocked, but I'm not noticing much difference either.


----------



## TheBoom

Imprezzion said:


> This is why I use the Aorus Master BIOS (2x8pin rev) on my Gaming OC. It has by far the highest voltage + clocks at the same power limit of any BIOS I tested.
> 
> On most BIOS like stock Gaming OC / Vision / Eagle and EVGA XC3 Ultra, TUF and such it can only do about 0.925v @ 1950 core max before hitting power throttling. On the Aorus Master BIOS it does about 1.081v 2115Mhz at the same power limits and barely ever throttles and if it does it's down to like 1.043v 2040Mhz ish and never drops below 2000Mhz in any load or benchmarks which the stock or EVGA XC3 Ultra BIOS could never do.
> 
> Don't ask me why this happens, it makes no sense and it shouldn't as it's just another 370w 2x8pin BIOS with nothing special in it, but it works....


Can't test this unfortunately, because the PCIe sub-ID is 2206 whereas mine is 2216.


----------



## chibi

Finally managed to get some time to pop in my new 3080 Strix. Did a Time Spy run to test for DOA and luckily everything went well. No visual artifacts, GPU temps are as expected, and all fans working correctly. Happy to report it will now move on to the next stage and get blocked 

Edit - this 3080 Strix exhibits a lot of coil whine, even with fps capped to the monitor's 120 Hz refresh rate. It would have been a candidate for return and exchange many years ago, but in this GPU market the retailer doesn't even accept closed BNIB returns, let alone cards opened for testing with coil whine. Oh well, hopefully I get better luck with the next GPU. Will have to deal with this one as is.

Time Spy - 16,896










----------



## OffBeatViBE

Hello, new to the forum.
I've been looking across the internet for this but couldn't find exact information about the LHR cards. I have an RTX 3080 Eagle OC Rev 2.0 with extremely low power limits. I was wondering: is there something different when it comes to flashing a BIOS on the Rev 2.0 cards, and can I use the BIOS from the Gigabyte Gaming OC Rev 2.0, which has a 370W limit, on my Eagle Rev 2.0? Both are 2x8-pin with the same PCB and even the same cooler.
Thanks!


----------



## fray_bentos

chibi said:


> Finally managed to get some time to pop in my new 3080 Strix. Did a Time Spy run to test for DOA and luckily everything went well. No visual artifacts, GPU temps are to be expected and all fans working correctly. Happy to report it will now move onto the next stage and get blocked
> 
> Edit - this 3080 Strix exhibits a lot of coil whine even with capped fps to monitor refresh rate 120 hz. Would have been a candidate for return and exchange many years ago. With this GPU market, the retailer doesn't even accept closed bnib returns, let alone opened for testing with coil whine. Oh well, hopefully next GPU I get better luck. Will have to deal with this one as is.
> 
> Time Spy - 16,896
> 
> 
> 
> 
> 
> 
> 
> 


You could always undervolt and not bother with the block. Undervolting does wonders for both fan noise and coil whine.



OffBeatViBE said:


> Hello, new to the forum.
> I've been looking across the internet for this, but couldn't find exact information, but the LHR cards. I have RTX 3080 Eagle OC Rev2.0 with extremely low power limits, I was wondering if there is something different when it comes to flashing bios on the rev2.0 cards and can I use the bios from the Gigabyte Gaming OC Rev2.0 bios that has 370W limit in my Eagle rev2.0 ? Both 2x8 Pin and the same PCB and even the same cooler.
> Thanks!


Are you prepared for the fan noise and potential coil whine for a ~5% performance gain at an extra 100 W load? You might as well undervolt: stock performance with much less noise and heat, and no risk of bricking from BIOS flashing.


----------



## Imprezzion

OffBeatViBE said:


> Hello, new to the forum.
> I've been looking across the internet for this, but couldn't find exact information, but the LHR cards. I have RTX 3080 Eagle OC Rev2.0 with extremely low power limits, I was wondering if there is something different when it comes to flashing bios on the rev2.0 cards and can I use the bios from the Gigabyte Gaming OC Rev2.0 bios that has 370W limit in my Eagle rev2.0 ? Both 2x8 Pin and the same PCB and even the same cooler.
> Thanks!


Eagle, Vision and Gaming OC have the exact same PCB. They even share waterblocks (Bykski and Alphacool for example), so yeah, possible. Also, if there is an Aorus Master LHR version with 2x8-pin, use that BIOS.


----------



## chibi

Is it normal for a brand new stock 3080 Strix to fail the Time Spy "Stress Test"?

The failure to complete comes from an unstable framerate.

I can pass all the normal benchmarks, but when I get to the stress test, it fails.

@fray_bentos - Do you have any videos you recommend as a guide to undervolting? I will try to undervolt the GPU later; never had to do this before. Not water blocking is not an option - all my rigs have been water cooled and will continue to be for the foreseeable future.

Time Spy - 16,995










----------



## fray_bentos

chibi said:


> Is it normal for brand new stock 3080 Strix to fail Time Spy "Stress Test?"
> 
> My error to complete comes from unstable framerate.
> 
> I can pass all the normal Benchmarks, but when I got to Stress test, it fails.
> 
> Time Spy - 16,995
> 
> 
> 
> 
> 
> 
> 
> 


The inconsistency is probably because your clock speed is fluctuating, which in turn is probably because the boost frequency changes with temperature. Try undervolting; it will be more consistent (though being inconsistent is not _really_ a "problem" per se).


----------



## chibi

Undervolted to 0.825V and 1750MHz, still coil whine. Nothing can save this card. Too bad


----------



## fray_bentos

chibi said:


> Undervolted to 0.825V and 1750MHz, still coil whine. Nothing can save this card. Too bad


Sad to hear, and that's with a framerate cap, right? It might settle down over a few months; that's my only consolation.


----------



## chibi

fray_bentos said:


> Sad to hear, and that's with a framerate cap right? It might settle down over a few months, that's my only consolement.


Correct, I've capped the fps to 117. Like I said, there's no saving this one, lol.


----------



## OffBeatViBE

Imprezzion said:


> Eagle, Vision and Gaming OC have the exact same PCB. They even share waterblocks (Bykski and Alphacool for example), so yeah, possible. Also, if there is an Aorus Master LHR version with 2x8-pin, use that BIOS.


Unfortunately only the Gaming OC BIOS is on TechPowerUp, and I already flashed it; it finally let my Eagle hit 2000 MHz and up on the core, where before I couldn't.

Also just found out that the later revisions of the Master have triple 8-pins.


----------



## Astral85

chibi said:


> Finally managed to get some time to pop in my new 3080 Strix. Did a Time Spy run to test for DOA and luckily everything went well. No visual artifacts, GPU temps are to be expected and all fans working correctly. Happy to report it will now move onto the next stage and get blocked
> 
> Edit - this 3080 Strix exhibits a lot of coil whine even with capped fps to monitor refresh rate 120 hz. Would have been a candidate for return and exchange many years ago. With this GPU market, the retailer doesn't even accept closed bnib returns, let alone opened for testing with coil whine. Oh well, hopefully next GPU I get better luck. Will have to deal with this one as is.
> 
> Time Spy - 16,896
> 
> 
> 
> 
> 
> 
> 
> 


I have the Strix 3080 OC and also get a lot of coil whine from it. It's a shame.


----------



## Spectre-

Hi all,

Got an RTX 3080 Gainward Phantom non-LHR (really loud cooler, got a custom fan profile), upgraded from a 5700 XT. What are the expected clocks on average? We've got a really hot summer ahead of us here, and I'm looking for a good experience on my X34 whilst keeping the wattage fairly low. Right now I have done +50 MHz core / +500 MHz mem @ 320-ish watts (getting 1940-ish / 9800 MHz). Just FYI, temps never go above 60°C.

Thanks


----------



## Imprezzion

Spectre- said:


> Hi all,
> 
> Got a RTX 3080 Gainward Phantom non LHR (really loud cooler got custom fan profile ) upgraded from a 5700XT. Whats the expected clocks on average, got a really hot summer here ahead of us i am looking for a good experience of my X34 whilst keeping the wattage fairly low. Rignt now i have done a +50mhz core/ 500mhz mem @ 320ish watts (getting 1940ish/ 9800mhz). Just a fyi temps never go above 60C.
> 
> Thanks


That is 1890 without the offset; that's totally normal. If it's a decent sample you could get away with about 2055-2040 MHz at 0.987 V at 340 W-ish, depending on resolution. 4K needs more power than 1080p.


----------



## chibi

Astral85 said:


> I have the Strix 3080 OC and also get a lot of coil whine from it. It's a shame.


My 3080 Strix is the same. Even with an undervolt + fps cap it sings the song of its people under any 3D load. Mine is terrible for coil whine.


----------



## Spectre-

Imprezzion said:


> That is 1890 without the offset, that's totally normal. If it's a decent sample you could get away with about 2055-2040Mhz at 0.987v at 340w ish. Depending on resolution. 4K needs more power then 1080p.


Cool 

Thanks mate, might look into getting a waterblock as well. Haven't owned a high TDP card since the R9 290x days. 

Would anyone know if EK has a block for this cant seem to find it on there website


----------



## Imprezzion

Spectre- said:


> Cool
> 
> Thanks mate, might look into getting a waterblock as well. Haven't owned a high TDP card since the R9 290x days.
> 
> Would anyone know if EK has a block for this cant seem to find it on there website


Nope the Phantom uses a custom PCB. No EK blocks for it. Not even Bykski makes a block for it and they make a block for everything lol.


----------



## Astral85

chibi said:


> My 3080 Strix is the same. Even with under volt + fps cap it sings the song of it's people under any 3d load. Mine is terrible for coil whine.


I notice you have a Seasonic Prime also. I wonder if it's the Seasonic Primes that the Strix 3080s are disagreeing with?


----------



## vigorito

chibi said:


> Finally managed to get some time to pop in my new 3080 Strix. Did a Time Spy run to test for DOA and luckily everything went well. No visual artifacts, GPU temps are to be expected and all fans working correctly. Happy to report it will now move onto the next stage and get blocked
> 
> Edit - this 3080 Strix exhibits a lot of coil whine even with capped fps to monitor refresh rate 120 hz. Would have been a candidate for return and exchange many years ago. With this GPU market, the retailer doesn't even accept closed bnib returns, let alone opened for testing with coil whine. Oh well, hopefully next GPU I get better luck. Will have to deal with this one as is.
> 
> Time Spy - 16,896
> 
> 
> 
> 
> 
> 
> 
> 



Same here - got the LHR v2 version of the 3080 Strix. I have a little coil whine, but since I'm using headphones while gaming I don't mind. What I do mind is the temperatures under load: with any undervolt I hit 70-75°C, and at stock easily 80°C. I was thinking these Asus cards had better temps, but the MSI 3080s I had before (several different models) definitely have better thermals by at least 10°C. Second thing: I'm not satisfied with the Strix fan noise - up to 50% it's so-so, but 55% and 60% are very noisy. Case is an O11D with 3x 120 mm bottom intake (Arctic P12), 3x 120 mm side intake (Arctic P12), and an ALF2 360 up top; the card is installed in the bottom PCIe slot. I'm thinking of moving it to the upper slot, since I had a problem with a previous Suprim X 3080 in the lower slot - it seems the GPUs (4 models tested) have a noise problem when very close to the bottom case fans. The Suprim X was very noisy in the bottom PCIe slot next to the P12 fans, and I never wanted to raise the GPU fans above 40% because the sound was very bad; but after I moved the Suprim X to the upper PCIe slot the whole scenario changed - I could easily raise the GPU fans to 65-70% and they were very quiet. I hope this will be the case with the Strix too... Any thoughts?

Current undervolt is 850 mV / 1850 MHz, fans 50%, temps 68-77°C - very rarely under 70, mostly 75-77. I kind of feel bad about a 1850 MHz clock; maybe I should have gotten a 3070 instead.

Set up:
X570 tomahawk
5600x
Seasonic gx-1000 gold
Gskill trident neo z 3600 cl16
Dell 1440p monitor, 165 Hz; all games are locked at 165 fps in MSI Afterburner


----------



## acoustic

You're likely creating a ton of turbulence by having the card so close to the bottom fans, which causes a lot of extra noise.


----------



## vigorito

I'm aware of that. The whole case and internals look much better when the GPU is in the lower slot. I'll definitely try the upper slot for the noise problem, but for temps I doubt anything will change.


----------



## acoustic

At the end of the day, asking an air cooler to dissipate 450 watts, keep the card under 60°C, while also being quiet... that is a lot to ask. You can have the cool temps but then you have noise; you can have quiet, but then you have high temps. Unless you put the card under water with a full-cover block, you have to make compromises at these power levels.


----------



## vigorito

I disagree with your comment about GPU air cooling - I had all of that with MSI, but that's not the topic right now. The only bad compromise I made was switching to Asus. I'm in the process of changing cases and the only GPU available at the time was the Strix. Will try the upper slot..


----------



## acoustic

vigorito said:


> Disagree with your comment about gpu air cooling,i had all of that with Msi but that not the topic right now,only bad compromiss i did is to switch to Asus  im in a process in changing case and only available gpu at this time was strix,will try upper slot..


I hope it gives you the results you're looking for!


----------



## Hirtle

Astral85 said:


> I notice you have a Seasonic Prime also. I wonder if it's the Seasonic Primes that the Strix 3080s are disagreeing with?


I have a Seasonic Prime and a Strix. I've found that taking the extra time to make sure the water block and backplate are mounted very well helps reduce the coil whine. Check for proper mounting pressure at the core, and also make sure the VRAM, mosfets, coils, caps, and anything else the water block actually touches on the PCB have good mounting pressure as well. I spent a lot of time with mine, mixing different thermal pad thicknesses to make sure it's right. Mine is power modded and doesn't whine during regular gaming at around 500 watts. However, when benching at over 600 watts it will still make noise.


----------



## vigorito

Huh, I just scored bingo by changing the GPU position back to the upper PCIe slot (I still prefer the bottom one, but I have issues with that setup): the coil whine has completely vanished. I don't know if the reason is swapping the slot or whether my coil-whine break-in period is simply over, but the noise has stopped and it seems the GPU is now breathing normally. I can easily ramp the GPU fans up to 70% with decent noise, like on the previous MSI 3080. Too bad - I like the bottom slot visually, but it's okay. Regarding temps, I have the same 850 mV but raised the clock to 1900 MHz and I'm getting under 70°C (previous was 850/1850), so the job is done.


----------



## Spectre-

Imprezzion said:


> Nope the Phantom uses a custom PCB. No EK blocks for it. Not even Bykski makes a block for it and they make a block for everything lol.


Found a waterblock, but it turns out to be $320 AUD, hmmm


----------



## Astral85

Hirtle said:


> I have a Seasonic Prime and a Strix. I've found that taking the extra time to make sure the water block and backplate are mounted very well will help reduce the coil whine. Check for proper mounting pressure at the core and also make sure that the VRAM, mosfets, coils, caps, and anything else the water block actually touches on the PCB has good mounting pressure as well. I spent a lot of time with mine and mixing up different thermal pad thicknesses to make sure it's right. Mine is power moded and doesn't whine during regular gaming at around 500 watts. However, when benching at over 600 watts it will still make noise.


Interesting, thanks. I do take good care when installing my water blocks. Perhaps my mounting pressure is slightly low? Can anyone point out which part the coil whine actually comes from? Here is my Strix 3080 when it was stripped:


----------



## Panchovix

Wondering 2 things; came back after some time lol.

1. There still isn't anything like an XOC VBIOS for the 3080 yet, right?
2. Has anyone here shunt modded a TUF 3080 OC? Did you shunt everything or only the power connectors? Looking to buy maybe some 8 mOhm/10 mOhm resistors; I have some 5 mOhm ones but that's probably way too much power


----------



## ssgwright

Panchovix said:


> Wondering 2 things, came back after some time lol.
> 
> 1. There still isn't something like a XOC VBIOS for the 3080 yet, right?
> 2. Has someone shunt modded a TUF 3080 OC here? Did you shunt everything or only power connectors? Looking to buy maybe some 8mOhm/10mOhm resistors, I have some 5mOhm ones but that's proably way too much power


Yes, on my TUF I stacked every single shunt with 5 mOhm. It's been going strong for, what, 6 months now, I believe.


----------



## Panchovix

ssgwright said:


> yes on my TUF I shunted every single shunt with 5m0hm been going strong for what 6 months now I believe


Did you add a waterblock, or is it on the stock cooler? That's pretty good, but with 5 mOhm shunts the max power is about 700 W I think lol


----------



## 9_realmz

Anybody got a copy of that 3080 Strix BIOS for LHR cards?


----------



## Panchovix

BTW, does anyone know of a 3080 non-LHR 2x8-pin VBIOS with a higher max fan speed? My TUF for example maxes out at 3000 RPM, but another GPU/VBIOS may allow more, like 3300 or 3500 RPM for example lol

Want to test because, reasons (?


----------



## Panchovix

So in the end I took the risk and shunt modded my 3080 TUF (with 8 mOhm resistors on top of the 5 mOhm ones, so a x1.62 factor). For now everything is fine, and max PCI-E power is about 75 W lol; the rest is 190-200 W per 8-pin.

The stock cooler is pretty good luckily, and it can maintain the temps without issues.

Wondering why the power limit still hits, though; it only happens in TimeSpy Extreme, at 480 W, and the card has to downclock itself.

I still have to tinker with the 5800X, but honestly don't know how. I know good 5800X scores are around 13K in TimeSpy for example, but I'm well below that.
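For anyone checking the x1.62 figure above: stacking a resistor on top of a shunt puts the two in parallel, and the card then under-reports power by stock ÷ combined. A quick sketch with the values from this post (5 mOhm stock, 8 mOhm stacked); the helper name is just for illustration:

```python
def parallel(r1_mohm: float, r2_mohm: float) -> float:
    """Equivalent resistance of two shunt resistors stacked in parallel."""
    return r1_mohm * r2_mohm / (r1_mohm + r2_mohm)

stock = 5.0    # mOhm, original shunt on the board
stacked = 8.0  # mOhm, resistor soldered on top

combined = parallel(stock, stacked)  # ~3.08 mOhm seen by the card's sensing
scale = stock / combined             # true power = reported power * scale

print(f"combined = {combined:.2f} mOhm, scale = x{scale:.3f}")
```

So with this mod the card's reported draw has to be multiplied by roughly 1.62 to estimate the real draw at the connectors.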

On a colder day will try to reach 20k on TimeSpy Graphics score, since when I did those tests, it was like 32°C ambient temp.

Results links:


TimeSpy: I scored 17 792 in Time Spy
TimeSpy Extreme: I scored 8 959 in Time Spy Extreme
Port Royal: I scored 12 992 in Port Royal
FireStrike: I scored 37 485 in Fire Strike
FireStrike Extreme: I scored 22 506 in Fire Strike Extreme
FireStrike Ultra: I scored 12 234 in Fire Strike Ultra


----------



## Falkentyne

Panchovix said:


> So at the end took the risk and shunt modded my 3080 TUF (with 8mOhm resistors on top of the 5mOhm ones, so x1.62 factor), for now everything is fine, and max PCI-E power is about 75W lol, the rest is 190-200W per 8-pin.
> 
> Stock cooler is pretty good luckily, and it can mantain the temps without issues.
> 
> Wondering why the power limit, though it only happens on TimeSpy Extreme, at 480W and it has to downclock itself.
> 
> I still have to tinker the 5800X, but honestly don't know how, I know 5800X good scores are 13K on TimeSpy for example, but I'm pretty below that
> 
> On a colder day will try to reach 20k on TimeSpy Graphics score, since when I did those tests, it was like 32°C ambient temp.
> 
> Results links:
> 
> 
> TimeSpy: I scored 17 792 in Time Spy
> TimeSpy Extreme: I scored 8 959 in Time Spy Extreme
> Port Royal: I scored 12 992 in Port Royal
> FireStrike: I scored 37 485 in Fire Strike
> FireStrike Extreme: I scored 22 506 in Fire Strike Extreme
> * FireStrike Ultra: I scored 12 234 in Fire Strike Ultra


You're limited by the NVVDD and MSVDD voltage rails. They have their own power limit and ignore shunts.
You need to increase NVVDD voltage with hardware tools (e.g. Elmor EVC2SX) to bypass it, unless you can cross-flash a vbios that has all these internal limits removed. There's some sort of BIOS-calibrated limit on how much current NVVDD (or MSVDD) is allowed to pull at 1.10 V; shunts have nothing to do with this. Check HWiNFO64 (GPU-Z is useless for this). Are you getting a "normalized" TDP% value that is close to the same value as the TDP slider? If so, that's what's throttling you. The NVVDD and MSVDD current limits report to TDP Normalized and cannot be seen on any power rail in HWiNFO64.

As far as I know, only the Galax 3080 Ti HOF 1 kW bios and the Kingpin 3090 1 kW bios have these limits removed, and I'm only 100% sure with respect to the Kingpin bios. I have no idea if such a bios exists for the 3080 (e.g. a 3080 Kingpin 1 kW bios with all limits removed that you could flash on your TUF and just run at a 50-60% TDP slider without hitting internal rail limits).


----------



## Panchovix

Falkentyne said:


> Are you getting a "normalized" TDP% value that is close to the same value as the TDP slider?


I do! So that was the reason. It theoretically should pull about 550 W, but near 480 W I get a Pwr limit anyway. Damn, that's new on Ampere, right? I do not remember that limit on Turing


Falkentyne said:


> I have no idea if such a bios exists for the 3080


Sadly it doesn't for now - wish we had a Galax HOF VBIOS


----------



## Falkentyne

Panchovix said:


> I do! So that was the reason, it theorically should pull about 550W, but near 480W I get a Pwr limit anyways, damn, that's new on Ampere right? I do not remember that limit on Turing
> 
> 
> Sadly it doesn't for now  wish we had a Galax HOF VBIOS


It's how the hardware is designed. If a bios doesn't exist with unlimited NVVDD/MSVDD current limits, you need to mod to increase NVVDD voltage (since an increase in voltage also increases this limit). MSI Afterburner only changes VID, not voltage.


----------



## Panchovix

Managed to improve my TimeSpy Graphics score and Port Royal on the shunted 3080 TUF (it now drew 490 W tops and, as Falkentyne says, the power limit can still occur anyway). The ambient temp was ~25°C instead of ~35°C, so the card boosted itself a little better; I checked PCI-E max draw as well and it was 79 W, so pretty happy with that.

So close to that 20K graphics score in TimeSpy; wondering how much lower the ambient temp needs to be for that lol









Result link: I scored 17 893 in Time Spy

On Port Royal I just went a little above 13k, so pretty happy about that









Result link: I scored 13 090 in Port Royal

Now just wondering something: does anyone know what happens when you flash a 3x8-pin VBIOS on a shunted card with 2x8-pin? Does it still get power limited at any point?


----------



## mouacyk

@Panchovix Pin-cross flashing like that will only give you 66% (2/3) of the power limit of the unlocked BIOS. At least this is what others have reported, who tried.
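If that reported 2/3 figure holds, the effective limit is easy to estimate. A tiny illustration (the 450 W BIOS limit here is a made-up example, not any specific BIOS):

```python
# Reported behavior from the thread: a 3x8-pin BIOS flashed onto a
# 2x8-pin board only honors 2/3 of the BIOS power limit.
flashed_limit_w = 450                  # hypothetical 3x8-pin BIOS limit
effective_w = flashed_limit_w * 2 / 3  # what the 2x8-pin card would enforce
print(effective_w)  # 300.0
```

Which means cross-flashing can even lose power headroom versus a native 2x8-pin BIOS with a higher limit.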


----------



## kertsz

Imprezzion said:


> Yup, BIOS has hardcoded PWM min and max. You could see if a BIOS from a different model or vendor allows lower fanspeed? Although in all my BIOS testing I've yet to come across one that allows lower then 30% lol.


What's new with this? Have you found a solution? Thanks, greetings


----------



## Panchovix

Managed to get 2085-2100 MHz in games at 4K maxed graphics on my shunt modded TUF 3080 on air; it tops out at 69°C at 450 W, mostly in Vanguard. Even then I still get power limited (internal rails).

*Man, I'm so envious of some of you guys that have an RTX 3080/3080 Ti/3090 and can maintain 2205+ MHz in games!*

Some pic examples below in the spoiler (see GPU2, ignore GPU1 since it's a 3060 Ti which I have at the same time lol)



Spoiler
Thanks to @*KingEngineRevUp* for linking this [Official] NVIDIA RTX 3080 Ti Owner's Club thread, since it netted me a little more overclock headroom on my card!


----------



## Imprezzion

I'm still on 2070-2055 core with 10900(+1400) memory on mine ever since I waterblocked it a few months back. Been solid so far in Far Cry 6, Forza Horizon 5 and Battlefield 2042 (RTX On). No crashes or weirdness. It can run most games at 2100 and maintain that without limiting at 1080p but it won't run Ray tracing at 2100. It needs to drop to 2070 for RT not to crash the driver. Power usage with the Aorus Master BIOS and no RGB or fans powered by the card, all external, is around 320-335w and it never throttles on 1080p. If I use resolution scaling / DSR to render at 150-200% it will throttle down to 2010-1980 ish.


----------



## Panchovix

Imprezzion said:


> I'm still on 2070-2055 core with 10900(+1400) memory on mine ever since I waterblocked it a few months back. Been solid so far in Far Cry 6, Forza Horizon 5 and Battlefield 2042 (RTX On). No crashes or weirdness. It can run most games at 2100 and maintain that without limiting at 1080p but it won't run Ray tracing at 2100. It needs to drop to 2070 for RT not to crash the driver. Power usage with the Aorus Master BIOS and no RGB or fans powered by the card, all external, is around 320-335w and it never throttles on 1080p. If I use resolution scaling / DSR to render at 150-200% it will throttle down to 2010-1980 ish.


I see. For example, at 1080p I can maintain 2115-2130 MHz if there's no RTX, and 2085-2100 MHz with RTX, since with the shunt mod the temps and power consumption skyrocket under RTX lol - well, to 69°C max, which is still a lot for overclocking.

I'm gonna try to see how much lower I can go on voltage and still hold 2010 MHz as well; I hope less than 1 V, but it sounds unlikely


----------



## Imprezzion

Panchovix said:


> I see, for example at 1080p I can maintain 2115-2130Mhz if there is no RTX, and 2085-2100Mhz if there's RTX, since with the shunt mod, with RTX the temps and power consumption skyrocket lol, well, to 69°C max, which is still a ton for overclocking.
> 
> I'm gonna try to see how much lower voltage I can go and get 2010Mhz as well, I hope less than 1V but it sounds unlikely


I can do 2010-1980 @ 0.987v below 44c. If it gets above 44c it gets unstable with RTX in for example Cyberpunk and I need to boost voltage a bin or 2.

My normal load temps at max OC are around 48-51c. I got my rad fans tuned very very low at like 550RPM idle and 700-750RPM load. Advantage of having such thick radiators with push pull is the silence as I can run super low fan speeds and keep the water temp delta well in check.

I also just bought a B550-XE + 5900X to replace my 10900KF which will go to a buddy who still games on a 2500K which really does not handle FH5 all that well and BF2042 is unplayable on that thing so..


----------



## Panchovix

Imprezzion said:


> I can do 2010-1980 @ 0.987v below 44c. If it gets above 44c it gets unstable with RTX in for example Cyberpunk and I need to boost voltage a bin or 2.
> 
> My normal load temps at max OC are around 48-51c. I got my rad fans tuned very very low at like 550RPM idle and 700-750RPM load. Advantage of having such thick radiators with push pull is the silence as I can run super low fan speeds and keep the water temp delta well in check.
> 
> I also just bought a B550-XE + 5900X to replace my 10900KF which will go to a buddy who still games on a 2500K which really does not handle FH5 all that well and BF2042 is unplayable on that thing so..


Wow, those temps are amazing! Just managed to do 2025 MHz at 0.981 V; at higher temps it settles at 2010 MHz (at 65°C). Toasty lol.

Man, what a great purchase! The B550-XE is pretty good, and a 5900X even better. I have an X570 TUF and a 5800X; so far it's been working pretty fine, but I haven't tried to overclock the CPU much at the moment.

In BF2042 my 5800X bottlenecks my 3080 sometimes at 1080p and 1440p - basically both of those resolutions run at the same FPS lol


----------



## Imprezzion

Panchovix said:


> Wow those temps are amazing! Just managed to do 2025Mhz at 0.981V, on higher temps it settles at 2010Mhz (at 65°C), toasty lol.
> 
> Man what a great purchase! That B550-XE is pretty good, and a 5900X even better; I have a 570X TUF and a 5800X, so far have been working pretty fine, but haven't tried to overclock the CPU much at the moment
> 
> On BF2042 my 5800X bottlenecks my 3080 sometimes at 1080p and 1440p, basically both of those resolution run at the same FPS lol


Yeah, so does the 10900KF. But that's more BF2042's terrible optimization. I just run 1080p with Ultra textures and the rest low/off; that way it's at least smooth. All ultra, even with ray tracing on, has about the same average FPS but random hiccups and drops.

I saw the 5900X on a Black Friday sale, and the B550-XE is basically a mini Crosshair with a 14+2 VRM and great audio and such for like 200 bucks, so kind of a no-brainer.

Only downside is I have to buy a new CPU block and drain the entire loop. Don't have AMD mounts for my block and can't buy them separately as the block is harvested from a EK Phoenix AIO..

I will obviously be OCing the 5900X so.


----------



## 9_realmz

Strix 3080: I can run 2100+ in WWZ, but in benchmarks it clocks down to 1980/2010. Is this because of the drivers? Why can't we OC our cards?
Every time it hits above 380 W it clocks down. I have great temps (58°C GPU / 60°C mem / 70°C hotspot); I re-did the thermal pads.


----------



## Panchovix

9_realmz said:


> strix 3080 , i can run 2100+ in wwz but benchmarks it clocks down to 1980/2010 is this because of the drivers? whay cant we oc our cards?
> everytime it hits above 380wat it clocks down..i have great temps 58g/60mem/70hot i re thermaled the pads..


Probably because of the power limit in your case, though I think the Strix can use 450 W if I'm not wrong


----------



## 9_realmz

Panchovix said:


> Probably because power limit in your case, though I think the Strix can use 450W if I'm not wrong


Yes, it's 370 W with 450 W max. I'm confused why it's doing this - I changed the thermal pads. Unless there's a temp I can't see on the card that's causing it? At 380 W+ it clocks down to 1995/2010 and stays there.
In World War Z it remains at what I set it to: 2100+, easily 2140...
In benchmarks it drops to 2000.


----------



## Panchovix

9_realmz said:


> yes it 370 with 450 max im convfused why its doing this..changed thermal pads. unless there is a temp i cant see on the card thats causing it? at 380+ it clocks down..to 1995/2010 and stays there.
> under worls war z it remans at what i set it to. 2100+ easy 2140....
> benchmarks it drops to 2000.


It's probably still getting power limited. Are you using the voltage slider to reach 1.1 V as well? Maybe it's even a voltage limit.
The Strix is a top-tier card; I think it shouldn't have temp issues


----------



## 9_realmz

Panchovix said:


> Probably it still getting power limited, are you using the voltage slider to reach 1.1V also? Maybe it is even a voltage limit.
> The Strix is a top tier card, I think it shouldn't have temp issues


What drivers are you using? So I max everything in Afterburner and it still clocks down to about 2k, except in World War Z where it stays at 2100+ because it's at 300 W or lower. If I hit 370 W it drops. I have an 850 W Asus PSU, a 10900K at 5 GHz, and a ROG Strix Z490-E.


----------



## Panchovix

9_realmz said:


> what drivers are you using? so i up everything in afterburner..still clocks down to about 2k. except in world war z it stays 2100+ cause it 3oo wat or lower.. if i hit 370 wat it drops. I have 850 wat asus psu 10900k 5ghz, rogue z 490e


For benchmarks I use 472.12. I sometimes hit 480 W in TimeSpy on my shunt modded 3080 TUF lol; it gets power limited by the internal rails


----------



## 9_realmz

Panchovix said:


> For benchmarks I use 472.12. I sometimes hit 480W on TimeSpy on my 3080 TUF shunt modded lol, it gets power limited by internal rails


I realized it was Superposition at 1080p Extreme causing the card to downclock so hard! Under Normal or High it seems to clock normally. In World War Z it clocks to 2100 easily because, I guess, the game doesn't really stress the GPU - 200-300 W max. I guess I can move on now. Here is my TimeSpy:

Maxed power, 21+/91c and 500+ mem / GPU clock not touched; it will bounce around 2100-2000.


----------



## Panchovix

9_realmz said:


> I relized it was super position on 1080p extreme causing the card to downclock to hard! under normal or high it seems normal clocking. in world war z it clocks to 2100 easy cause i guess the game doesnt really stress the gpu 200-300wat max. i guess i can now move on..here is my timespy,
> 
> maxed power 21+/91c and 500+ mem./ gpu clock not touched.will bounce around 2100-2000.
> 


Oh I see. My clocks drop in TimeSpy's 2nd test, for example. This is my best graphics score at the moment: it starts at 2160 MHz and drops to 2115-2130 MHz from temps, but to 2085 MHz in the 2nd test because of the power limit










I'm so close to 20k that it hurts; I don't know what else to try to reach it lol, just hoping for lower ambient temps. (Ignore my CPU score - it's pretty low, but welp, I haven't overclocked it either)


----------



## Xipe

Can I put this Gaming OC BIOS (Gigabyte.RTX3080.10240.210607) on my Eagle? Does it have Resizable BAR activated?


----------



## 9_realmz

Panchovix said:


> Oh I see, my clocks drops on TimeSpy 2nd test for example, this is my best graphics score at the moment, it starts at 2160Mhz and it drops to 2115-2130Mhz by temps, but to 2085Mhz on the 2nd test because power limit
> 
> 
> 
> I'm so close to 20k that it hurts, I don't know what else to try to reach that lol, just hope lower ambient temps (Ignore my CPU score, is pretty low but welp, haven't overclocked it either)


Oh, and I flashed the Gigabyte Aorus Xtreme BIOS on my Strix also. Seems no different than the Strix BIOS.


----------



## 9_realmz

Xipe said:


> Can I put this Gigabyte Gaming OC BIOS (Gigabyte.RTX3080.10240.210607) on my Eagle? Does it have Resizable BAR activated?


If you're not sure, I wouldn't do it. Do you know how to use nvflash?


----------



## Xipe

9_realmz said:


> If you're not sure, I wouldn't do it. Do you know how to use nvflash?


Yes, I flashed my Eagle to the Gaming OC BIOS before. But yesterday I flashed with the Gigabyte utility to get Resizable BAR, and the TDP went down to 345W. I need 370W and Resizable BAR.


----------



## 9_realmz

Xipe said:


> Yes, I flashed my Eagle to the Gaming OC BIOS before. But yesterday I flashed with the Gigabyte utility to get Resizable BAR, and the TDP went down to 345W. I need 370W and Resizable BAR.


You can flash almost any BIOS as long as it's for the same GPU. Like a 3080 must get a 3080 BIOS, and LHR or non-LHR must match too... I think you can flash the Aorus Master..


----------



## Panchovix

Xipe said:


> Yes, I flashed my Eagle to the Gaming OC BIOS before. But yesterday I flashed with the Gigabyte utility to get Resizable BAR, and the TDP went down to 345W. I need 370W and Resizable BAR.


For 2x8 pins, the absolute max is about 350W.

The only way to change that is shunt modding, VBIOS flashing won't help (believe me, I did try with my TUF 3080 before shunt modding it)


----------



## 9_realmz

Panchovix said:


> For 2x8 pins, the absolute max is about 350W.
> 
> The only way to change that is shunt modding, VBIOS flashing won't help (believe me, I did try with my TUF 3080 before shunt modding it)


What shunts did you get? On air? I might want to do it. Is there a guide for the Strix 3080?

Because I can't seem to get past more than 400-ish watts.. where are the other 50 watts??


----------



## Imprezzion

Eagle, Vision and Gaming OC BIOSes are interchangeable. 

I'm an idiot btw. I had the Master R1.0 BIOS on my Gaming OC and it performed way better power-limit-wise and showed way higher clocks, but never ran very well frame-time-wise. I finally figured it out today.. 

It only shows 8GB VRAM at a reduced bus width when using that BIOS... 

Flashed the stock BIOS: 10GB at full bandwidth. EVGA XC3 Ultra Hydro Copper BIOS: full 10GB.
Newer Master R1.0 BIOS: 8GB.. weird..

But yeah, I got my full 10GB back now finally...


----------



## Panchovix

9_realmz said:


> What shunts did you get? On air? I might want to do it. Is there a guide for the Strix 3080?
> 
> Because I can't seem to get past more than 400-ish watts.. where are the other 50 watts??


I did use 8mOhm shunts, ERJ-M1WSF8M0U, and shunted every resistor.
And yes, stock cooler for now; max temps I've seen are 69°C core / 82°C junction with 30°C ambient.
When the ambient temp is lower, the max is like 60°C core / 76°C junction, though with fans at 2500+ RPM.

I do get to use 475W in heavy rasterization games, and 500-550W in heavy ray-traced games.


Imprezzion said:


> Eagle, Vision and Gaming OC BIOSes are interchangeable.
> 
> I'm an idiot btw. I had the Master R1.0 BIOS on my Gaming OC and it performed way better power-limit-wise and showed way higher clocks, but never ran very well frame-time-wise. I finally figured it out today..
> 
> It only shows 8GB VRAM at a reduced bus width when using that BIOS...
> 
> Flashed the stock BIOS: 10GB at full bandwidth. EVGA XC3 Ultra Hydro Copper BIOS: full 10GB.
> Newer Master R1.0 BIOS: 8GB.. weird..
> 
> But yeah, I got my full 10GB back now finally...


What, really? That can happen? Man, getting just 8 of the 10GB, that sounds painful


----------



## 9_realmz

Panchovix said:


> I did use 8mOhm shunts, ERJ-M1WSF8M0U, and shunted every resistor.
> And yes, stock cooler for now; max temps I've seen are 69°C core / 82°C junction with 30°C ambient.
> When the ambient temp is lower, the max is like 60°C core / 76°C junction, though with fans at 2500+ RPM.
> 
> I do get to use 475W in heavy rasterization games, and 500-550W in heavy ray-traced games.
> 
> What, really? That can happen? Man, getting just 8 of the 10GB, that sounds painful


You can pretty much flash any BIOS to test it. I have the Gigabyte Xtreme BIOS on my Strix. I'm trying the EVGA XOC because it's 380/450W instead of 370/450W right now.


----------



## Imprezzion

Panchovix said:


> I did use 8mOhm shunts, ERJ-M1WSF8M0U, and shunted every resistor.
> And yes, stock cooler for now; max temps I've seen are 69°C core / 82°C junction with 30°C ambient.
> When the ambient temp is lower, the max is like 60°C core / 76°C junction, though with fans at 2500+ RPM.
> 
> I do get to use 475W in heavy rasterization games, and 500-550W in heavy ray-traced games.
> 
> What, really? That can happen? Man, getting just 8 of the 10GB, that sounds painful


Yeah, I couldn't run Far Cry 6 with the high-res textures; it errors out with low VRAM, and Horizon 5 had bad pop-in due to lack of VRAM, because the games expect a 3080 with 10GB and thus allocate 10GB, but the card can't use it.


----------



## Panchovix

9_realmz said:


> You can pretty much flash any BIOS to test it. I have the Gigabyte Xtreme BIOS on my Strix. I'm trying the EVGA XOC because it's 380/450W instead of 370/450W right now.


My card, even shunted, gets very buggy with a 3x8-pin VBIOS, so I'm using the stock TUF VBIOS, and with the shunts it uses up to ~550W, so it's pretty good so far

Wondering though, does anyone know the max fan RPM of the EVGA XC3 3080 VBIOS?


----------



## Imprezzion

Panchovix said:


> My card, even shunted, gets very buggy with a 3x8-pin VBIOS, so I'm using the stock TUF VBIOS, and with the shunts it uses up to ~550W, so it's pretty good so far
> 
> Wondering though, does anyone know the max fan RPM of the EVGA XC3 3080 VBIOS?


23xx as far as I can remember.


----------



## 9_realmz

Where is the Strix 3080 shunt mod guide?
My card will run 2130 easy; in World War Z it's damn solid too. In benchmarks it gets bouncy; I guess benchmarks stress more than games?
Testing 2100 at 1.25V, and I'll see how low I can go on voltage at 2100. I think +145MHz is about the max unless I throw 1.06-1.08V at it under the current cooling to go higher.
I have yet to see anything much over 410W.


----------



## Panchovix

9_realmz said:


> Where is the Strix 3080 shunt mod guide?
> My card will run 2130 easy; in World War Z it's damn solid too. In benchmarks it gets bouncy; I guess benchmarks stress more than games?
> Testing 2100 at 1.25V, and I'll see how low I can go on voltage at 2100. I think +145MHz is about the max unless I throw 1.06-1.08V at it under the current cooling to go higher.
> I have yet to see anything much over 410W.


I don't know if there's a guide for the Strix; probably hasn't been done since the Strix can use 450W I think (or in theory at least lol)

Just now managed to do a bit better on my scores. Man, I'm missing ONE point to reach 20k on graphics score lol, it hurts
(Ignore the 3060 Ti there, it wasn't used for the benchmark)

I scored 18 128 in Time Spy (AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10) - www.3dmark.com

I scored 13 207 in Port Royal (AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10) - www.3dmark.com


----------



## Panchovix

It seems the 466.63 driver is pretty good for TimeSpy and Port Royal, meanwhile 496.76 is amazingly good for Fire Strike; got some good scores (again, ignore the 3060 Ti haha)

I scored 38 518 in Fire Strike (AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10) - www.3dmark.com

I scored 22 952 in Fire Strike Extreme (AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10) - www.3dmark.com

I scored 12 407 in Fire Strike Ultra (AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10) - www.3dmark.com


----------



## 9_realmz

Panchovix said:


> It seems the 466.63 driver is pretty good for TimeSpy and Port Royal, meanwhile 496.76 is amazingly good for Fire Strike; got some good scores (again, ignore the 3060 Ti haha)
> 
> View attachment 2534925
> 
> I scored 38 518 in Fire Strike (AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10) - www.3dmark.com
> 
> View attachment 2534926
> 
> I scored 22 952 in Fire Strike Extreme (AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10) - www.3dmark.com
> 
> View attachment 2534927
> 
> I scored 12 407 in Fire Strike Ultra (AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10) - www.3dmark.com


I don't want to pony up the money for the 3DMark add-ons; I'm cheap, I guess.


----------



## SPL Tech

The shunt mod is where the money is at. Don't screw around with all these BIOSes, just shunt it. Or more specifically, use a silver conductive pen so you can draw your shunt on.


----------



## 9_realmz

SPL Tech said:


> The shunt mod is where the money is at. Don't screw around with all these BIOSes, just shunt it. Or more specifically, use a silver conductive pen so you can draw your shunt on.


Which silver pen, and where do I get one?


----------



## 9_realmz

Panchovix said:


> It seems the 466.63 driver is pretty good for TimeSpy and Port Royal, meanwhile 496.76 is amazingly good for Fire Strike; got some good scores (again, ignore the 3060 Ti haha)
> 
> View attachment 2534925
> 
> I scored 38 518 in Fire Strike (AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10) - www.3dmark.com
> 
> View attachment 2534926
> 
> I scored 22 952 in Fire Strike Extreme (AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10) - www.3dmark.com
> 
> View attachment 2534927
> 
> I scored 12 407 in Fire Strike Ultra (AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10) - www.3dmark.com


I just ordered those shunts today from Mouser (Mouzer?). Hopefully I'll get them before the weekend! Hope it's the right 0.008 ohm:
Current Sense Resistors - SMD 2512 0.008ohm 1% Curr Sense AEC-Q200


----------



## SPL Tech

Panchovix said:


> I do get to use 475W in heavy-rasterization games, and 500-550W on heavy ray traced games.


I see this all the time and really question it. First off, HOW do you know? How are you measuring it? You need a direct measurement method, like reading straight from an intelligent PSU. If you're using one of those power meters that your computer plugs into, keep in mind that's the total draw of everything in the computer plus the inefficiency of the PSU, so the GPU is just a portion of that.

I have a 3080 which I have shunt modded with core and mem maxed out. I also have an 11700K @ 5 GHz. On top of that I have tons of LED lights, it's all watercooled, and I have like 15 USB accessories and 7 fans. Even so, when I am running a benchmark like Superposition, I am getting about 630W draw from the PSU. That is for everything, every component that is plugged into the PSU. If I run both Superposition and a CPU benchmark at the same time, I get about 750W draw, which is not even remotely realistic for any level of actual use. None of those figures occur in gaming. The most GPU-intensive game I have ever seen is Cyberpunk, and in that game I get maybe 600W draw MAX, but mostly in the 400s and 500s. In something like BF 2042, I get like high 400s to low 500s, maybe spikes to the mid or high 500s. Keep in mind, that's not GPU, that's the entire output of the PSU. Also, I run everything at 4K resolution.

So when someone says they get 600W draw on a 3080 of any type, I'd like to see proof, as I've never seen that on my card despite a shunt mod. Maybe something like the Kingpin, which allows a legit vcore voltage boost to 1.21V or whatever they allow. But the card needs to be BIOS unlocked for that.
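The point about wall meters can be made concrete: an AC meter reads every component downstream of the outlet plus the PSU's conversion loss, so the GPU's share has to be backed out. A rough sketch of that estimate; the efficiency and component numbers below are illustrative guesses, not measurements from any specific system:

```python
def gpu_draw_estimate(wall_w: float, psu_efficiency: float,
                      cpu_w: float, other_w: float) -> float:
    """Rough DC-side GPU draw backed out of an AC wall-meter reading.

    wall * efficiency gives the total DC power the PSU delivers;
    subtracting the software-reported CPU package power and a guess
    for fans/drives/RGB leaves the GPU's approximate share.
    """
    dc_total = wall_w * psu_efficiency
    return dc_total - cpu_w - other_w

# Illustrative numbers only: 630W at the wall during Superposition,
# ~92% Platinum-class efficiency at this load, 120W CPU, 60W misc.
print(f"{gpu_draw_estimate(630, 0.92, 120, 60):.0f}W")  # ~400W
```

With those assumed numbers, a 630W wall reading works out to roughly 400W at the GPU, which is why a wall figure alone can't prove a 600W card draw.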


----------



## Panchovix

9_realmz said:


> I just ordered those shunts today from Mouser (Mouzer?). Hopefully I'll get them before the weekend! Hope it's the right 0.008 ohm:
> Current Sense Resistors - SMD 2512 0.008ohm 1% Curr Sense AEC-Q200


Nice! I hope you manage to get it working without issues


SPL Tech said:


> I see this all the time and really question it. First off, HOW do you know? How are you measuring it? You need a direct measurement method, like reading straight from an intelligent PSU. If you're using one of those power meters that your computer plugs into, keep in mind that's the total draw of everything in the computer plus the inefficiency of the PSU, so the GPU is just a portion of that.
> 
> I have a 3080 which I have shunt modded with core and mem maxed out. I also have an 11700K @ 5 GHz. On top of that I have tons of LED lights, it's all watercooled, and I have like 15 USB accessories and 7 fans. Even so, when I am running a benchmark like Superposition, I am getting about 630W draw from the PSU. That is for everything, every component that is plugged into the PSU. If I run both Superposition and a CPU benchmark at the same time, I get about 750W draw, which is not even remotely realistic for any level of actual use. None of those figures occur in gaming. The most GPU-intensive game I have ever seen is Cyberpunk, and in that game I get maybe 600W draw MAX, but mostly in the 400s and 500s. In something like BF 2042, I get like high 400s to low 500s, maybe spikes to the mid or high 500s. Keep in mind, that's not GPU, that's the entire output of the PSU. Also, I run everything at 4K resolution.
> 
> So when someone says they get 600W draw on a 3080 of any type, I'd like to see proof, as I've never seen that on my card despite a shunt mod. Maybe something like the Kingpin, which allows a legit vcore voltage boost to 1.21V or whatever they allow. But the card needs to be BIOS unlocked for that.


Honestly I was measuring it with a power meter; I was just subtracting CPU consumption (reported by software) and a little more for HDDs/fans.
On TimeSpy, for example, at 4K it gets power limited at that "theoretical" 475W, and on Port Royal and Quake RTX it gets power limited again at that "theoretical" 550W; in both cases it uses 1.1V for the max clocks, though when it drops, it goes to 1.068V or even 1V sometimes. 

For some reason pure ray tracing pushes the internal rails less than pure rasterization; not sure why.

In games I haven't seen that power consumption either, except maybe Vanguard at 4K, which is the only game where I actually get power limited at 470W and the clocks drop from 2115MHz to 2010MHz for example; not even BF2042 does that on my 3080.

So maybe you're right and it's using less than it should, but that's because of the internal limits; no way to go higher, at least without volt modding. 

(BTW I shunted everything with 8mOhm, so at 1.62x the max TDP in my case is 567W; I haven't seen it though, it gets power limited by the internal rails before that)
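That 1.62x figure is just parallel-resistor arithmetic: stacking a new shunt on top of the stock one lowers the sense resistance, so the card under-reads its own current by the ratio of old to new resistance. A minimal sketch of that math, assuming the commonly cited 5 mOhm stock shunts on these boards (verify your own card's shunt values before trusting the output):

```python
def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

def shunt_scale(stock_mohm: float, added_mohm: float) -> float:
    """How much the card under-reads power after stacking a shunt.

    The controller computes current from the voltage drop across the
    shunt assuming the stock resistance, so real power is roughly
    reported power * (stock / new parallel resistance).
    """
    return stock_mohm / parallel(stock_mohm, added_mohm)

# Assumed values: 5 mOhm stock shunts, 8 mOhm ERJ-M1WSF8M0U stacked on top
scale = shunt_scale(5.0, 8.0)
print(f"scale factor: {scale:.3f}")          # 13/8 = 1.625x
print(f"340W limit -> {340 * scale:.0f}W")   # real draw at the software cap
```

5 mOhm in parallel with 8 mOhm gives 40/13 ≈ 3.08 mOhm, so the card reads about 1/1.625 of the real current; that is where the quoted "1.62x" comes from, and a 340W software cap then corresponds to roughly 551-552W at the card.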


----------



## 9_realmz

Panchovix said:


> Nice! I hope you manage to get it working without issues
> 
> 
> Honestly I was measuring it with a power meter; I was just subtracting CPU consumption (reported by software) and a little more for HDDs/fans.
> On TimeSpy, for example, at 4K it gets power limited at that "theoretical" 475W, and on Port Royal and Quake RTX it gets power limited again at that "theoretical" 550W; in both cases it uses 1.1V for the max clocks, though when it drops, it goes to 1.068V or even 1V sometimes.
> 
> For some reason pure ray tracing pushes the internal rails less than pure rasterization; not sure why.
> 
> In games I haven't seen that power consumption either, except maybe Vanguard at 4K, which is the only game where I actually get power limited at 470W and the clocks drop from 2115MHz to 2010MHz for example; not even BF2042 does that on my 3080.
> 
> So maybe you're right and it's using less than it should, but that's because of the internal limits; no way to go higher, at least without volt modding.
> 
> (BTW I shunted everything with 8mOhm, so at 1.62x the max TDP in my case is 567W; I haven't seen it though, it gets power limited by the internal rails before that)






I will not be shunting the PCIe slot; a shunt calculator should say so... 5mOhm is 50%, I remember reading.


----------



## SPL Tech

9_realmz said:


> I will not be shunting the PCIe slot; a shunt calculator should say so... 5mOhm is 50%, I remember reading.


You should shunt the PCIe. It will limit you hard. The card does not limit power on a per-rail basis: as soon as ANY of the rails hits its power max, the entire card gets throttled. So in theory you could have zero limit on the PCIe plugs, but as soon as you hit 75W on the PCIe slot, the card will start power limiting. It doesn't matter; the PCIe slot can handle it on any real mobo. The only boards where it might be an issue are super-entry-level crappy boards.
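The per-rail behavior described here is effectively an OR across sense points: the card throttles as soon as any single rail reaches its cap, no matter how much headroom the others have. A toy model under assumed rail names and caps (the real Ampere power table differs per board):

```python
# Toy model of per-rail power limiting: the card throttles as soon as
# ANY sense point reaches its cap, regardless of total headroom.
RAIL_CAPS_W = {"pcie_slot": 75, "8pin_1": 150, "8pin_2": 150}  # illustrative

def throttled(draws_w: dict) -> bool:
    """True if any single rail is at or over its cap."""
    return any(draws_w[rail] >= cap for rail, cap in RAIL_CAPS_W.items())

# Plenty of total headroom (300W of a 375W sum), but the un-shunted slot
# rail hits 75W first, so the whole card power-limits anyway.
print(throttled({"pcie_slot": 75, "8pin_1": 115, "8pin_2": 110}))  # True
print(throttled({"pcie_slot": 60, "8pin_1": 140, "8pin_2": 140}))  # False
```

This is why skipping the slot shunt can leave the card limiting exactly as it did stock: the slot rail becomes the first cap to trip.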


----------



## Panchovix

9_realmz said:


> I will not be shunting the PCIe slot; a shunt calculator should say so... 5mOhm is 50%, I remember reading.


If you don't shunt the PCIe, then as soon as the PCIe slot gets power limited, all your other power rails will get power limited too, as if you hadn't shunt modded any resistor.

That's why I went with 8mOhm: it's basically safe on the PCIe. On my TUF 3080, via software measurement, the PCIe hasn't reached more than 80W; most of the work is done by the 8-pin power connectors (ASUS does a very good job balancing the power and reducing PCIe slot consumption)


----------



## mouacyk

Anybody get native-res image sharpening to work without distorting the aspect ratio and limiting the refresh rate on the latest 496 drivers? Why the f did they have to change it up? I don't use GE, but I hear the normal image sharpening has moved there permanently as a filter?


----------



## 9_realmz

Panchovix said:


> If you don't shunt the PCIe, then as soon as the PCIe slot gets power limited, all your other power rails will get power limited too, as if you hadn't shunt modded any resistor.
> 
> That's why I went with 8mOhm: it's basically safe on the PCIe. On my TUF 3080, via software measurement, the PCIe hasn't reached more than 80W; most of the work is done by the 8-pin power connectors (ASUS does a very good job balancing the power and reducing PCIe slot consumption)



OK, you answered the question I was going to ask: whether you used the 8mOhm. I got 10 of them, should be enough. Did you use a specific video and do the mod that way, or multiple videos?
Which one is a good guide, just as a refresher? I've watched a few already and know what I'm doing with a soldering iron and OCing, since the year 2000. Remember the pencil mods for vdroop on AMD chips? I've done a few volt mods with resistors on graphics cards also. Some other things I mess with and sell on Mercari.


Check out my store on Mercari! https://www.mercari.com/u/669092192/


----------



## Panchovix

9_realmz said:


> OK, you answered the question I was going to ask: whether you used the 8mOhm. I got 10 of them, should be enough. Did you use a specific video and do the mod that way, or multiple videos?
> Which one is a good guide, just as a refresher? I've watched a few already and know what I'm doing with a soldering iron and OCing, since the year 2000. Remember the pencil mods for vdroop on AMD chips? I've done a few volt mods with resistors on graphics cards also. Some other things I mess with and sell on Mercari.
> 
> 
> Check out my store on Mercari! https://www.mercari.com/u/669092192/


About the Strix I don't know, sadly, but on my TUF I used 6 shunts; the Strix has 3x8-pin, so maybe it will use 8-9 shunts, so you're safe.

I soldered mine; I don't know the pencil mods that people do, so I just shunted. Maybe someone can help you there


----------



## 9_realmz

mouacyk said:


> Anybody get native-res image sharpening to work without distorting the aspect ratio and limiting the refresh rate on the latest 496 drivers? Why the f did they have to change it up? I don't use GE, but I hear the normal image sharpening has moved there permanently as a filter?


You use such big words, you must be smart... hehe. I think you forget we're just monkeys watching a YouTube video and copying it in hopes we get 3 more FPS.


Panchovix said:


> About the Strix I don't know, sadly, but on my TUF I used 6 shunts; the Strix has 3x8-pin, so maybe it will use 8-9 shunts, so you're safe.
> 
> I soldered mine; I don't know the pencil mods that people do, so I just shunted. Maybe someone can help you there


No, I have lots of soldering experience. Use solder braid to clean up the solder if you ever remove the shunts.


----------



## 9_realmz

Panchovix said:


> About the Strix I don't know, sadly, but on my TUF I used 6 shunts; the Strix has 3x8-pin, so maybe it will use 8-9 shunts, so you're safe.
> 
> I soldered mine; I don't know the pencil mods that people do, so I just shunted. Maybe someone can help you there


OK, I just shunted with 8mOhm, 7 of them: there were 5 at the 3 power connectors, 1 on the PCIe, and 1 on the back of the card. It worked! I was afraid I'd missed a good solder joint, and I didn't have a multimeter to test them after soldering; it's a bit hard to get at some of the ones that are in a small space. I noticed I missed one where the solder didn't flow down to the bottom resistor. Glad I checked them before I hooked the card back up.
Now I set 100% power and it seems to not go over 250-300-ish.
My hotspot reading shoots up to almost 90 though... damn, I was around 75 max before. Maybe I will pull the card and check pads and paste? In Superposition Extreme I got 12k; it was set to 2100 and dropped to 2085, at 1.1V with +500 RAM. At least it worked! Not sure if I'll get to cracking open the card tonight for a pad check; a few pads came off, maybe I forgot one.


----------



## Panchovix

9_realmz said:


> OK, I just shunted with 8mOhm, 7 of them: there were 5 at the 3 power connectors, 1 on the PCIe, and 1 on the back of the card. It worked! I was afraid I'd missed a good solder joint, and I didn't have a multimeter to test them after soldering; it's a bit hard to get at some of the ones that are in a small space. I noticed I missed one where the solder didn't flow down to the bottom resistor. Glad I checked them before I hooked the card back up.
> Now I set 100% power and it seems to not go over 250-300-ish.
> My hotspot reading shoots up to almost 90 though... damn, I was around 75 max before. Maybe I will pull the card and check pads and paste? In Superposition Extreme I got 12k; it was set to 2100 and dropped to 2085, at 1.1V with +500 RAM. At least it worked! Not sure if I'll get to cracking open the card tonight for a pad check; a few pads came off, maybe I forgot one.
> View attachment 2535413
> 
> View attachment 2535414


Pretty nice, man! Since you shunted all of them, a 300W software reading = 480W real power consumption. 

Your temps are higher, and a 90°C hotspot is kinda on the hot side (but not by much, honestly). You can try running the benchmarks with the fans at higher speed, and maybe changing the paste/thermal pads, though the hotspot is just the hottest sensor on the card, which is not necessarily the VRAM; it can be a VRM, for example.

The good thing about shunting is maintaining clocks at higher resolutions, so that's pretty good.


----------



## 9_realmz

Panchovix said:


> Pretty nice, man! Since you shunted all of them, a 300W software reading = 480W real power consumption.
> 
> Your temps are higher, and a 90°C hotspot is kinda on the hot side (but not by much, honestly). You can try running the benchmarks with the fans at higher speed, and maybe changing the paste/thermal pads, though the hotspot is just the hottest sensor on the card, which is not necessarily the VRAM; it can be a VRM, for example.
> 
> The good thing about shunting is maintaining clocks at higher resolutions, so that's pretty good.




1.25V didn't lock up; at 1.1V it locked up in TimeSpy, but it could be overheating?
This is +130/1.25V, but halfway through it drops to 2100, probably because the hotspot hit almost 90. Yeah, very steady clocks now. I'll check my card and add a fan on top of the backplate. Over the weekend I'll do more benching and see what happens.


----------



## 9_realmz

I repasted the GPU; I think I hadn't tightened the 2 spring screws. Anyhow, still spiking to 85 or so, but not 90.
I'll check the backplate later.. so the hotspot is about 10°C hotter at times under stress.


----------



## 9_realmz

Panchovix said:


> Pretty nice, man! Since you shunted all of them, a 300W software reading = 480W real power consumption.
> 
> Your temps are higher, and a 90°C hotspot is kinda on the hot side (but not by much, honestly). You can try running the benchmarks with the fans at higher speed, and maybe changing the paste/thermal pads, though the hotspot is just the hottest sensor on the card, which is not necessarily the VRAM; it can be a VRM, for example.
> 
> The good thing about shunting is maintaining clocks at higher resolutions, so that's pretty good.


Been searching for hotspot temps after a shunt mod and it's a little hard to find. It might be that the higher temps are from the components in the pic, due to the increased watts? The heatsink ASUS uses doesn't have all of these components thermal-padded, or a plate that contacts them (circled in RED), so maybe these parts are getting warmer. I've taken the card apart like 4 times reseating it, and the GPU doesn't have the best paste imprint, slightly off to one side; after the shunt mod something is throwing it off ever so slightly, I think, but it's the best I can do. Again, it's hotspot, not GPU temp. It does seem to stop around 85°C now, not into the 90s, and only in benchmarking, for a few spots when under duress, does it go up to 85°C. Gaming has low temps.


----------



## Panchovix

9_realmz said:


> Been searching for hotspot temps after a shunt mod and it's a little hard to find. It might be that the higher temps are from the components in the pic, due to the increased watts? The heatsink ASUS uses doesn't have all of these components thermal-padded, or a plate that contacts them (circled in RED), so maybe these parts are getting warmer. I've taken the card apart like 4 times reseating it, and the GPU doesn't have the best paste imprint, slightly off to one side; after the shunt mod something is throwing it off ever so slightly, I think, but it's the best I can do. Again, it's hotspot, not GPU temp. It does seem to stop around 85°C now, not into the 90s, and only in benchmarking, for a few spots when under duress, does it go up to 85°C. Gaming has low temps.
> 
> View attachment 2535653
> 
> 
> View attachment 2535654


It could be those VRMs. In my case, though, my hotspot temps went from 70°C max to 80°C max, so more watts can definitely heat the VRMs up more


----------



## 9_realmz

Panchovix said:


> It could be those VRMs. In my case, though, my hotspot temps went from 70°C max to 80°C max, so more watts can definitely heat the VRMs up more


What do you keep your power percentage at for gaming? Mine's at 100%.


----------



## Panchovix

9_realmz said:


> What do you keep your power percentage at for gaming? Mine's at 100%.


100% as well, so in my case 340W x 1.62 = 551W max; I haven't reached that power consumption yet though


----------



## 9_realmz

If you ever want to make thermal epoxy: it conducts heat well, you can control its strength, and it's non-electrically-conductive. I applied it straight to resistors and things, directly to the back side of a GPU (the caps), and on VRMs. If you make it a tad soft you can razor-blade it off easily, e.g. between a memory chip and the epoxy layer.
It can also be a more permanent glue!


----------



## Kutalion

Hey guys, been having issues with my EVGA 3080 XC3. 

The following happens: with the GPU at stock, when I enter most games, the first time there's a high-FPS scene the monitors usually turn off and I hear the Windows "unplugging" sound. I reset the PC and everything works again until a certain problematic stage of the game; usually not in the menu, which is FPS capped, but something else. It's usually accompanied by a very loud coil whine at the moment of turning off.

If I put the GPU at less than 55% power limit, everything works fine no matter the FPS and the GPU never turns off the displays. 

Did anybody encounter/fix this problem?


----------



## Panchovix

Kutalion said:


> Hey guys, been having issues with my EVGA 3080 XC3.
> 
> The following happens: with the GPU at stock, when I enter most games, the first time there's a high-FPS scene the monitors usually turn off and I hear the Windows "unplugging" sound. I reset the PC and everything works again until a certain problematic stage of the game; usually not in the menu, which is FPS capped, but something else. It's usually accompanied by a very loud coil whine at the moment of turning off.
> 
> If I put the GPU at less than 55% power limit, everything works fine no matter the FPS and the GPU never turns off the displays.
> 
> Did anybody encounter/fix this problem?


Hmm, that sounds interesting. If it happens at stock but goes without issues when limiting the power consumption, it can mean various things:

The game is badly optimized, or the driver has a bug (more likely the first than the second).
When you underpower the card you're undervolting it as well, so it could mean that at higher voltages the card chokes; in that case the card would be faulty and you would need to RMA it.

As an extra option, something to do with the monitor itself, but I doubt it.

A good way to test is with other games: if the same thing occurs, I would say it's a GPU issue, but if it happens only in that one game, it could be just a driver/game bug.

I haven't encountered that issue with my 3080, but it did happen to me a year ago with a 2070S: same symptoms but in multiple games, and in the end the GPU was faulty and I did an RMA.


----------



## fray_bentos

Kutalion said:


> Hey guys, been having issues with my EVGA 3080 XC3.
> 
> The following happens: with the GPU at stock, when I enter most games, the first time there's a high-FPS scene the monitors usually turn off and I hear the Windows "unplugging" sound. I reset the PC and everything works again until a certain problematic stage of the game; usually not in the menu, which is FPS capped, but something else. It's usually accompanied by a very loud coil whine at the moment of turning off.
> 
> If I put the GPU at less than 55% power limit, everything works fine no matter the FPS and the GPU never turns off the displays.
> 
> Did anybody encounter/fix this problem?


Could be a PSU issue. Also, cap your max framerate to your screen's maximum refresh rate in the Nvidia control panel. Does it still crash if you do that?


----------



## Kutalion

Panchovix said:


> Hmm, that sounds interesting. If it happens at stock but goes without issues when limiting the power consumption, it can mean various things:
> 
> The game is badly optimized, or the driver has a bug (more likely the first than the second).
> When you underpower the card you're undervolting it as well, so it could mean that at higher voltages the card chokes; in that case the card would be faulty and you would need to RMA it.
> 
> As an extra option, something to do with the monitor itself, but I doubt it.
> 
> A good way to test is with other games: if the same thing occurs, I would say it's a GPU issue, but if it happens only in that one game, it could be just a driver/game bug.
> 
> I haven't encountered that issue with my 3080, but it did happen to me a year ago with a 2070S: same symptoms but in multiple games, and in the end the GPU was faulty and I did an RMA.


It happens in most games, just some more than others. Yes, I have a profile that undervolts and underpowers. Thing is, the card used to run fine at stock and then just started doing this weird thing at one point. Sheesh, so probably a faulty GPU then; that's gonna be a pain 




fray_bentos said:


> Could be PSU issue. Also cap your max framerate to the maximum refresh rate of your screen in Nvidia control panel. Does it still crash if you do that?


It's an SF750 Platinum, so I kind of doubt it. The card worked fine beforehand on an SF600 Platinum, and power consumption never comes anywhere close to 600W.


----------



## Panchovix

Finally managed to surpass a 20000 graphics score in Time Spy; even if it's by 2 points, it's honest work lol
















I scored 18 150 in Time Spy: AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10 (www.3dmark.com)





Also improved my Port Royal and Time Spy Extreme scores; the only thing I did differently was run the benches at like 3AM with colder ambient temps lol

















I scored 9 118 in Time Spy Extreme: AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10 (www.3dmark.com)





















I scored 13 209 in Port Royal: AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10 (www.3dmark.com)





Power consumption was between ~470 and 492W; can't draw more because of the internal rail limits :c


----------



## fray_bentos

Kutalion said:


> It's SF750 platinum, so I kind of doubt it. The card worked beforehand on the SF600 Platinum fine. Power consumption never goes even close to over 600W.


 ... and what about my suggestion of putting a framerate cap at the max of the monitor refresh rate all the time in Nvidia control panel? Does that stop the crashing?


----------



## mrjayviper

Any info (PCB/power phases/etc.) on the MSI Sea Hawk AIO 3080? What 3rd-party/custom waterblock is suitable? Thank you


----------



## Kutalion

fray_bentos said:


> ... and what about my suggestion of putting a framerate cap at the max of the monitor refresh rate all the time in Nvidia control panel? Does that stop the crashing?


I gotta say it seems to have done the trick. I installed new drivers and added that limit, and so far I haven't managed to crash it once. Ran a couple of games briefly, ran Superposition 4K; neither managed to black-screen it so far. Thanks!


----------



## acoustic

Kutalion said:


> I gotta say it seems to have done the trick. I installed new drivers and added that limit. So far I haven't managed to crash it once. Ran a couple games briefly, ran superposition 4k, neither managed to get it to black screen so far. Thanks!


I recommend using the framerate limiter in RTSS rather than NVCP or in-game limiters.

You shouldn't be crashing with the framerate unlocked - imo, you're covering up an issue by locking the framerate.


----------



## Kutalion

acoustic said:


> I recommend using framerate limiter in RTSS rather than NVCP or in-game limiters.
> 
> You shouldn't be crashing with the framerate unlocked - imo, you're covering up an issue by locking the framerate.


While I totally agree with you, it has stayed stable even at 99% utilization. I think it could be the card trying to boost to frequencies the chip just can't handle while staying stable, whether because of only one cluster of those small caps behind the GPU core or something else.

Edit: tried a 107% power limit and went through Superposition 8K without a hitch.


----------



## fray_bentos

Kutalion said:


> While I do totally agree with you, it seemed to have stayed stable even at 99% utilization. I think it could be it trying to boost to frequencies the chip just can't handle and stay stable. Be it because of only 1 cluster of those small caps behind the GPU core or something else.
> 
> Edit: tried 107% power limit, and went through SuperPosition 8k without a hitch.


Glad that the fps cap works for now. It isn't unheard of for some cards not to like really high framerates; there were recent stories of Amazon's "New World" MMO killing 3090s with high framerates in menus. I had already noticed on my own card that any benchmark running at thousands of fps always gave coil whine, which didn't seem very healthy for the card. Accordingly, I always set a max fps cap to avoid any unwanted hardware failure in the future. Sure, everything _should_ be OK, but why risk it when my monitor can't display more than 144/165 Hz anyway?!

Also, if the GPU was working fine but then started crashing at high fps, I wonder if it could be dust on the caps at the back of the card; even a tiny amount can cause problems. Disconnect the power and hold the power button down to discharge before giving it a thorough dust-out, preferably with an "electric duster" blower.


----------



## 9_realmz

It's an NVIDIA driver or monitor driver issue (refresh rate thing?). Load the proper monitor driver, or change NVIDIA drivers.

Also try setting a V/F curve with a fixed voltage so you don't get spikes.


----------



## zebra_hun

Panchovix said:


> Finally managed to surpass 20000 graphics score on TimeSpy, even if it's by 2 points, it's honest work lol
> 
> I scored 18 150 in Time Spy, 9 118 in Time Spy Extreme, and 13 209 in Port Royal (AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080, www.3dmark.com). The only thing I did was run the benches at like 3AM with colder ambient temps. Power consumption was between ~470 and 492W.


Very nice scores in 3DMark.
I have a Gigabyte Gaming OC with a waterblock, on the original BIOS; max wattage is 350.
I share my results here; these are my best.


----------



## Panchovix

zebra_hun said:


> Very nice scores in 3DMark.
> I have a Gigabyte Gaming OC Waterblock, with original BIOS. Max Watt is 350.
> I share my results here:
> These are my best.


Those are pretty good scores for 350W! Imagine if you had power limit headroom; it would boost above 2205MHz.


----------



## zebra_hun

Panchovix said:


> Those are pretty good scores for 350W! Imagine if you had power limit headroom; it would boost above 2205MHz.


Thx. I'm not a PC guru; changing the BIOS isn't simple for me. My card has 2x8-pin connectors, and my friends on a Hungarian forum told me it's easier to find a BIOS for a 3x8-pin card.
No matter, I'm happy with my card, and the 350-360W limit suits me. I only play BF5 and 1942 lol, 20 years old... BF2042 is sh...t for everyone. I use a RivaTuner cap at 144 fps; BF5 draws 200-240W, and I have super min (1% low) fps.

BFV MSI AB log:

My 10850K is at 4.8GHz all-core and the RAM is OC'd to 4133MHz. In Battlefield 5 the 1% low fps is fixed at a stable 144fps, everything on ultra but with Ray Tracing off, at 2560x1440. This is a gamer PC.
One more thx


----------



## mrjayviper

Any thoughts/feedback on the Waterforce (not the AIO)? How's the quality of the block? Thanks


----------



## Imprezzion

mrjayviper said:


> Any thoughts/feedback on the waterforce (not AIO)? How's the quality of the block? Thanks


The block is fine; the OEM is probably Bykski, so it's quite good quality. It's just that the card it's on is quite meh for the price, as it isn't a 3x8-pin.


----------



## Panchovix

Has someone tested windows 11 vs windows 10 on 3DMark on their 3080? Wondering if there's any optimization or something lol


----------



## mrjayviper

Imprezzion said:


> The block is fine; the OEM is probably Bykski, so it's quite good quality. It's just that the card it's on is quite meh for the price, as it isn't a 3x8-pin.


I saw that the air-cooled Aorus Master is 3x8-pin (higher boost) while the watercooled one isn't, which is disappointing.

thanks


----------



## Imprezzion

mrjayviper said:


> I saw that the air-cooled Aorus Master is 3x8-pin (higher boost) while the watercooled one isn't, which is disappointing.
> 
> thanks


Yeah, I mean, if you shell out the cash for a factory water-cooled card, one can assume you're going to at least offset-overclock it, but it's limited to an effective 345W on 2x8-pin, so it barely hits 2000MHz boost without throttling like mad.

I have the regular Gaming OC (same PCB) with a Bykski block, but that was just because it was the one I could get for MSRP back then; otherwise I would've never gotten a 2x8-pin.


----------



## nikoli707

Panchovix said:


> Finally managed to surpass 20000 graphics score on TimeSpy, even if it's by 2 points, it's honest work lol
> 
> I scored 18 150 in Time Spy, 9 118 in Time Spy Extreme, and 13 209 in Port Royal (AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080, www.3dmark.com). The only thing I did was run the benches at like 3AM with colder ambient temps. Power consumption was between ~470 and 492W.


Please, as a favor to all... right-click those "bolded" PCIe/8pin1/8pin2/total wattage readings in HWiNFO64, go to "Customize values", and in the bottom right enter the 1.xx multiplier for your shunts, so you get a proper power reading without having to do the maths.


----------



## nikoli707

What is the max daily wattage recommended for a 2x8-pin? Is everybody having no problems pulling 120W from the PCIe slot on long 24h+ renders?


----------



## Panchovix

nikoli707 said:


> Please, as a favor to all... right-click those "bolded" PCIe/8pin1/8pin2/total wattage readings in HWiNFO64, go to "Customize values", and in the bottom right enter the 1.xx multiplier for your shunts, so you get a proper power reading without having to do the maths.


It's already done; those bold values are the real values (8mOhm shunts stacked on the stock ones = 1.62x power multiplier). I edited them exactly for that haha
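That 1.62x figure falls straight out of the parallel-resistor math: the added shunt sits in parallel with the stock one, and the card's controller still assumes the stock resistance. A quick sketch of the arithmetic, assuming 5 mOhm stock shunts (the value quoted elsewhere in the thread; check your own board):

```python
def parallel(r1: float, r2: float) -> float:
    """Combined resistance of two shunt resistors stacked in parallel."""
    return (r1 * r2) / (r1 + r2)

def hwinfo_multiplier(stock_mohm: float, added_mohm: float) -> float:
    """Correction factor for HWiNFO's "Customize values" box: the card's
    controller still assumes the stock shunt, so its power readings are
    low by the ratio stock / effective resistance."""
    return stock_mohm / parallel(stock_mohm, added_mohm)

# 8 mOhm soldered on top of the stock 5 mOhm shunts:
print(f"{hwinfo_multiplier(5.0, 8.0):.2f}x")  # 1.62x
```

The same function gives 2.00x for a 5-on-5 stack, which matches the "2x the stock power limit" figure mentioned further down the thread.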




nikoli707 said:


> what is the max daily wattages recommended for a 2x8pin? is everybody having no problems running pulling 120w on long 24h+ renders from the pcie lane?


I've run up to 80-90W through my PCIe slot daily with no issues so far, though that's mostly thanks to ASUS's power balancing, which is pretty good; even shunt modded I haven't seen it above 90W.
If you use 5mOhm shunts, though, it may reach that.
And note that ASUS boards don't have fuses either.


----------



## SPL Tech

Anyone know the lowest-ohm resistors you can use for a shunt mod? The idea is to remove the power limit as completely as possible. EVGA XC3.


----------



## 9_realmz

Panchovix said:


> Has someone tested windows 11 vs windows 10 on 3DMark on their 3080? Wondering if there's any optimization or something lol


Go back a page to my posts; I ran Win 11 for those benches. I'm now back on 10. Not really any different.


----------



## 9_realmz

SPL Tech said:


> Anyone know what the lowest ohm resistors you can use for a shunt mod? The idea would be to remove the power limit entirely to the best degree possible. Evga XC3.


I think it depends on your card, but I read 8mOhm is safe. Lower, and you run the risk of a locked card.


----------



## KeepCalm.

Wanted to share the *20226 points* in a Time Spy graphics run.

(The cold temps really helped this morning; the water was only 5 degrees, as the window was open.)
The card is an MSI Gaming X Trio, watercooled in a custom loop, running the Suprim BIOS.
Must admit I was a bit afraid of condensation, but I let the PC warm up very slowly, so all was fine.


----------



## Panchovix

KeepCalm. said:


> wanted to share the *20226 points *in timespy graphics run
> 
> (The cold temps really helped this morning for this one, water was 5 degrees only, window was open
> the card is MSI Gaming X Trio, watercooled in custom loop, surprime BIOS
> must admit, was a bit afraid of condensation, but I let the PC warm up very slowly, so all was fine
> 
> View attachment 2539545


Wow, that's pretty nice! At which clocks were you running the benchmark?

Man, on my shunted TUF at ~490W I get a 20001 graphics score, but on air lol. It seems I really need to watercool it, but I don't want a custom loop, just an AIO.

Also, ambient temps here in Santiago, Chile (20°C min - 35°C max) do not help lol


----------



## Imprezzion

Panchovix said:


> Wow that's pretty nice! At which clocks were you running the benchmark?
> 
> Man, on my shunted TUF at 490W~ I get 20001 graphics score, but on air lol, it seems I really need to watercool it, but I do not want a custom loop, just AIO
> 
> Also ambient temps here in Chile, Santiago (20°C min - 35°C max) do not help lol


The EVGA Hybrid XC3 or FTW3 AIO will fit the core; it fits any 3080, as the hole spacing is all the same. The VRAM plate will also fit. The VRM heatsink and shroud will not; that requires some hackery or ghetto modding, or copper heatsinks with thermal epoxy or whatever. I ran that setup for a while on my Gigabyte Gaming OC till my custom loop was complete. The core at 350W hardly ever reached 49°C.

You can buy them directly from EVGA or maybe even local retailers. Some in the Netherlands stocked them.


----------



## Panchovix

SPL Tech said:


> Anyone know what the lowest ohm resistors you can use for a shunt mod? The idea would be to remove the power limit entirely to the best degree possible. Evga XC3.


5mOhm on top of the 5mOhm default ones (2.5mOhm equivalent, aka 2x the stock power limit) is the lowest I would go; below that I think you may risk the PCIe slot lol.
Also, I think EVGA uses fuses on their PCIe slot shunt and power shunts (max 120W and 240W respectively before the fuse trips), so you have to be sure the shunted power values stay below that.
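Those fuse numbers are easy to sanity-check against a given shunt stack. A rough sketch (the 120 W / 240 W ratings are just the figures quoted above, not verified specs, and 5 mOhm stock shunts are an assumption):

```python
def stacked_resistance(stock_mohm: float, added_mohm: float) -> float:
    """Effective resistance after soldering a second shunt on top (parallel)."""
    return (stock_mohm * added_mohm) / (stock_mohm + added_mohm)

def true_draw(reported_w: float, stock_mohm: float, added_mohm: float) -> float:
    """Real wattage on a rail when the controller reports reported_w,
    since it still assumes the stock shunt resistance."""
    return reported_w * stock_mohm / stacked_resistance(stock_mohm, added_mohm)

# 5 mOhm on top of the stock 5 mOhm -> 2.5 mOhm equivalent, i.e. 2x power:
print(stacked_resistance(5.0, 5.0))  # 2.5
# A slot-shunt reading of 60 W is then really 120 W, right at the quoted fuse rating:
print(true_draw(60.0, 5.0, 5.0))     # 120.0
```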


----------



## SPL Tech

Panchovix said:


> 5mOhm on tops of the 5mOhm default ones (so aka 2x the stock power limit, 2.5mOhm equivalent) is the lowest I would go, below that I think you may risk the PCI-E slot lol.
> Also I think EVGA uses fuses on their PCI-E slot shunt and Power shunts (being max 120W and 240W respectively to the fuse doesn't trip), you have to be sure that a shunt will have the power values below that.


I think there is a limit to how much power these can draw even with a shunt mod, though. I find that even with shunt mods I get PWR listed as the limitation in GPU-Z. Maybe there are some additional internal limits measuring power? What is usually the limiting factor that stops additional performance with a shunt mod?


----------



## Panchovix

SPL Tech said:


> I think there is a limit to how much power these can draw even with a shunt mod though. I find even with shunt mods I get PWR listed as the limitation in GPUZ. I think maybe there is some additional internal limits that are measuring power? What is usually the limiting factor with a shunt mod to stop additional performance?


Internal rails; per @Falkentyne, it seems something besides the power delivery itself limits the card.

It happens to me at ~490W or so.


----------



## SPL Tech

Panchovix said:


> Internal rails; per @Falkentyne, it seems something besides the power delivery itself limits the card.
> 
> It happens to me at ~490W or so.


So then, in theory, if you are shunt modding, there should be no difference in performance between a lower-end 2x8-pin card and a 3x8-pin card with a higher BIOS power limit? At that point, with shunt mods, they would both be the same?


----------



## Panchovix

SPL Tech said:


> So then in theory if you are shunt moding, there should be no difference in performance between a 2x8 pin lower end card vs a 3x8 card with a higher BIOS power limit? At that point, with shunt mods they would both be the same?


Basically, but the top 3x8-pin 3080 has 450W as its max limit, so shunting a 2x8-pin card still gets you a little more performance?


----------



## SPL Tech

Panchovix said:


> Internal rails; per @Falkentyne, it seems something besides the power delivery itself limits the card.
> 
> It happens to me at ~490W or so.


Any idea how to bypass those internal rails? The idea would be to have no power limit and have the card limited only by vcore and nothing else. My 850W PSU is not being worked hard enough, and I need to keep her happy. With shunt mods on my 3080 and an 11700K at 5.00GHz, I average around 610W of PSU output (not power draw at the outlet, PSU output) with 100% GPU usage in Battlefield 2042. It can spike up to the mid-to-high 600s, but typically sits in the low 600s or high 500s during gameplay. Idle power usage is about 100W of PSU output, so I figure the card is probably drawing around 450W, maybe a bit more depending on how much the CPU is using.


----------



## nikoli707

Is there an agreed-upon range for GDDR6X overclocking? I know it has ECC, so if I push it to +1200MHz or even more it won't crash and will just lose performance. Right now I have it at 20800MHz, but I need to test lower and higher to see where my chip's limits are.


----------



## Panchovix

nikoli707 said:


> Is there an agreed upon range for gddr6x overclocking? i know it has ECC so i can push it to +1200mhz or even more but it wont crash and will just lose performance. right now i have it at 20800mhz but i need to test lower and higher to see where my chips limits are.


Some people can do +2000MHz and still gain performance; VRAM silicon lottery, I guess.


----------



## KeepCalm.

Panchovix said:


> Wow that's pretty nice! At which clocks were you running the benchmark?
> 
> Man, on my shunted TUF at 490W~ I get 20001 graphics score, but on air lol, it seems I really need to watercool it, but I do not want a custom loop, just AIO
> 
> Also ambient temps here in Chile, Santiago (20°C min - 35°C max) do not help lol


Don't worry, I don't think this run represents real-world performance.
Here is the link to the run:








I scored 17 290 in Time Spy: AMD Ryzen 5 5600X, NVIDIA GeForce RTX 3080 x 1, 16384 MB, 64-bit Windows 10 (www.3dmark.com)





And yes, your warm temperatures are challenging, I assume. But don't damage your hardware; it's not worth it just for a score.


----------



## Imprezzion

nikoli707 said:


> Is there an agreed upon range for gddr6x overclocking? i know it has ECC so i can push it to +1200mhz or even more but it wont crash and will just lose performance. right now i have it at 20800mhz but i need to test lower and higher to see where my chips limits are.


It's easy to test. Get a game or a benchmark and pause it / stand still in a spot where FPS stays the same. I used The Division 2 while standing at the White House spawn looking at a static spot; FPS was a static 128 at +0. Then just raise the mem clock and check FPS. If it goes up, good. If it stays the same, you might be getting close to ECC; if it drops or starts to fluctuate wildly, it's too high and ECC is kicking in.

It's a very rough way to test, but it can give you a ballpark idea of where the VRAM likes to be.

For my card it scales very well up to +1200; +1400 sees no regression but also no gains, +1500 starts to fluctuate, and +1600 fluctuated wildly. I went from 128 to 135 FPS from +0 to +1200.

I just called it there and now run +1200-1250 on my daily profile.

Obviously, verify scaling with multiple game tests and benchmarks afterwards.
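If you log FPS at each offset, picking the sweet spot can be automated. A toy sketch of that logic; the FPS numbers below are hypothetical, just shaped roughly like the +1200/+1400/+1500 behaviour described above:

```python
# Hypothetical FPS readings from a static scene at each memory offset (MHz).
samples = {0: 128.0, 400: 130.5, 800: 133.0, 1200: 135.0, 1400: 135.0, 1500: 133.5}

def best_offset(fps_by_offset, tol=0.5):
    """Largest offset that still improves FPS by more than tol; past that
    point the error correction is eating the extra bandwidth."""
    best, best_fps = 0, fps_by_offset[0]
    for off in sorted(fps_by_offset):
        if fps_by_offset[off] > best_fps + tol:
            best, best_fps = off, fps_by_offset[off]
    return best

print(best_offset(samples))  # 1200 with these numbers
```

The `tol` margin absorbs run-to-run noise, which is why a static, repeatable scene matters so much for this method.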


----------



## ssgwright

Here are my Port Royal and Time Spy runs; not too shabby for a 2x8-pin TUF:









I scored 12 991 in Port Royal: Intel Core i9-12900K, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 11 (www.3dmark.com)





















I scored 19 827 in Time Spy: Intel Core i9-12900K, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 11 (www.3dmark.com)


----------



## Falkentyne

SPL Tech said:


> Anyone know what the lowest ohm resistors you can use for a shunt mod? The idea would be to remove the power limit entirely to the best degree possible. Evga XC3.


If you are NOT using an EVC2SX device to access the true NVVDD and MSVDD voltages (MSI Afterburner only controls the NVVDD VID, not GPU Vcore), 8 mOhm seems most reasonable.
You can use 5 mOhm safely, but your card will never reach the theoretical limits anyway, because the internal rails linked to the NVVDD and MSVDD voltages will power throttle you somewhere around 520W regardless of the shunts. This seems to be some sort of current limit tied to voltage rather than an actual power limit, as raising one of those two voltages with the Elmor device allows more power to be drawn.

Someone on hardwareluxx said the Strix has higher internal limits for NVVDD and MSVDD than normal reference boards (something like the Strix defaults being equal to the Kingpin 3090 with the dip switches enabled for NVVDD and MSVDD): when they modded 5 mOhm on the Strix they did not get power limited in Time Spy Extreme, while all other 5 mOhm shunt-modded boards (besides the KP) hit a power limit long before the 700-800W theoretical limit of 5 mOhm shunts (350W * 2 or 400W * 2).

If you have access to a true XOC BIOS with all power limits removed completely and no internal limits on the voltage rails, just use that (the 3090 1kW Kingpin BIOS is the only one that has that; for the 3080 Ti Galax 1kW XOC BIOS, I have no idea whether the internal rail limits tied to vcore are also removed a la Kingpin 1kW).

Anyway, with 5 mOhm shunts on a non-Strix/non-Kingpin card, you will end up with a 'dead zone' from 75% to 100% on the TDP slider, where values between 75% and 100% do not increase the power limit at all. That's because the internal NVVDD and MSVDD rail limits are hit (in most games) somewhere around 520W (remember?), which is about 75% on the slider, so going from 75% to 100% does nothing. Basically, there is a default and a maximum value for the NVVDD and MSVDD rails, and they don't seem to respond at all to slider values BELOW 100% (only TDP itself responds below 100%, which is obvious). However, those rail limits DO respond to values past 100% if the TDP slider can go past 100%; they seem to scale, so the higher the slider goes past 100%, the more power you can draw. E.g. a 121% slider will let you exceed those internal rail limits by 121%, if and only if the "default" and "maximum" values for these are not set to the same value (no way to determine without vBIOS code debugging whether they are).
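The "whichever limit is lower wins" behaviour described above can be written down as a small model. This is speculative: it only encodes the ~520 W rail figure and the past-100% scaling claimed in the post, not anything from NVIDIA documentation:

```python
def power_ceiling(slider_pct: float, shunted_tdp_w: float,
                  rail_limit_w: float = 520.0) -> float:
    """Effective draw ceiling under this model: the lower of the
    shunt-scaled TDP limit and the internal NVVDD/MSVDD rail limit,
    where the rail limit only scales for slider values above 100%."""
    tdp = shunted_tdp_w * slider_pct / 100.0
    rail = rail_limit_w * max(1.0, slider_pct / 100.0)
    return min(tdp, rail)

# The 'dead zone': from ~75% up to 100% the rail limit, not TDP, is binding.
print(power_ceiling(80.0, 700.0))              # 520.0 (rail-limited)
print(power_ceiling(100.0, 700.0))             # 520.0 (still rail-limited)
print(round(power_ceiling(121.0, 700.0), 1))   # 629.2 (rail scales with the slider)
```

With a 700 W theoretical shunted limit, the model reproduces the dead zone: everything from about 75% to 100% on the slider lands on the same 520 W ceiling.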


----------



## SPL Tech

Falkentyne said:


> The 3090 1kW Kingpin BIOS is the only one that has that; for the 3080 Ti Galax 1kW XOC BIOS, I have no idea whether the internal rail limits tied to vcore are also removed a la Kingpin 1kW.


I don't think they are. I have that BIOS and I'm only hitting about 100W above the stock FTW3 BIOS, so 500-something watts max power draw before I get VRel/VOp. But with the 1kW Galax BIOS I never see a green PWR in GPU-Z; it's always blue or orange.


----------



## Imprezzion

Well, that was interesting. With the Tomb Raider trilogy having been free on Epic since Christmas I decided to play those games again and even tho Tomb Raider GOTY is from 2013 it still absolutely destroys my power limits and offset OC. 
In a modern game like BF5 / BF2042 or even Cyberpunk 2077 the card can hit around 2055-2070 @ 1.068v ish and games like The Division 2 or Guardians of the Galaxy run a bit lower, more like 1995-2010 @ 1.012v ish at the limiter (340-345w on this 2x8 pin model). 
However, Tomb Raider will barely even hit 1920Mhz and drops the voltage all the way to 0.968v and even then drops under 1900Mhz regularly. I mean, the game still runs at like 260FPS on all max 1080p but still. 

I had to basically put my curve back on that runs 2010 @ 0.975 to make it boost a bit higher lol.


----------



## Panchovix

Imprezzion said:


> Well, that was interesting. With the Tomb Raider trilogy having been free on Epic since Christmas I decided to play those games again and even tho Tomb Raider GOTY is from 2013 it still absolutely destroys my power limits and offset OC.
> In a modern game like BF5 / BF2042 or even Cyberpunk 2077 the card can hit around 2055-2070 @ 1.068v ish and games like The Division 2 or Guardians of the Galaxy run a bit lower, more like 1995-2010 @ 1.012v ish at the limiter (340-345w on this 2x8 pin model).
> However, Tomb Raider will barely even hit 1920Mhz and drops the voltage all the way to 0.968v and even then drops under 1900Mhz regularly. I mean, the game still runs at like 260FPS on all max 1080p but still.
> 
> I had to basically put my curve back on that runs 2010 @ 0.975 to make it boost a bit higher lol.


That's pretty interesting. I also got it; gonna test how it goes tomorrow, maybe.
On my 3080 I only hit the ~490W power limit in Vanguard at 4K; everything else draws less in my case haha


----------



## OffBeatViBE

I have a 2x8-pin Gigabyte Eagle RTX 3080 LHR flashed with the BIOS from the Gaming OC model (power limit around 350W). I managed to undervolt my card to 0.968V and can maintain fairly stable 2025-2040MHz core speeds (under benchmarking I of course hit the power limits) with +700 on the memory (not sure if I can go higher). I've noticed that right after +1000 on the memory my performance drops, but I'm not sure it's worth pushing to that edge, because it also eats into my overall power limit and the performance gains seem minimal...


----------



## SPL Tech

OffBeatViBE said:


> I have a 2x8-pin Gigabyte Eagle RTX 3080 LHR flashed with the BIOS from the Gaming OC model (power limit around 350W). I managed to undervolt my card to 0.968V and can maintain fairly stable 2025-2040MHz core speeds (under benchmarking I of course hit the power limits) with +700 on the memory (not sure if I can go higher). I've noticed that right after +1000 on the memory my performance drops, but I'm not sure it's worth pushing to that edge, because it also eats into my overall power limit and the performance gains seem minimal...


GDDR6X is ECC memory, so it won't crash; it just drops in performance when you OC it too much. Around +1000 is what most people get.


----------



## Imprezzion

Are the 3080 and 3080 Ti PCBs generally the same? I have a Gigabyte 3080 Gaming OC non-LHR with a waterblock on it, but I don't have the stock cooler anymore. I have the box and receipt and everything else, but the cooler had broken fans, so I chucked it.

I kinda wanna buy a 3080 Ti Gaming OC, slap my block on it and slap that cooler on my 3080 so I can sell that as a complete card again. I can find PCB detail photos on Google obviously but maybe one of you guys had both cards or knows a bit more if they are the same or if there's subtle differences.

Edit: never mind. It's actually much cheaper to just get a 3080 Ti FE for like 1450, buy a new block for that, and sell my Gigabyte 3080 with its block. It's non-LHR, so it should fetch almost as much as a Ti FE would cost me.


----------



## Panchovix

So we have a new card into the family here









NVIDIA introduces GeForce RTX 3080 12GB with 8960 CUDA cores and 350W TDP (videocardz.com)





3080 12GB, 350W TDP

Wondering about the "MSRP" price; EVGA already has some at $1249, I think.


----------



## mouacyk

Totally not worth the hassle of breaking down a loop, much less trying to get one and the appropriate block and pay the new MSRP. However, DLDSR this Friday will be interesting. Really curious how they are managing this without motion vectors and game-specific training. It's nice to get Image Sharpening back in the Control Panel separate from NIS. Also, AV1 hardware decoding support should be nice.


----------



## Panchovix

Man DLDSR looks amazing, I will finally update the driver from 472.12 to that on Friday.

And yeah, IMO NVIDIA released this so they no longer have to sell the 3080 10GB at $700


----------



## mouacyk

Panchovix said:


> Man DLDSR looks amazing, I will finally update the driver from 472.12 to that on Friday.
> 
> And yeah, IMO NVIDIA did release this to not sell the 3080 10GB at 700USD anymore


I am looking forward to combining DLDSR + Image Sharpening to get even more of an advantage in CODMW2019. Currently I use DLSS (Quality) + Image Sharpening, but the softening-then-sharpening makes too many edges glow, which ends up being distracting.

The real reason NVIDIA is coming up with all these SKUs is that demand for GPUs has never been so high: they can afford the marketing cost of selling each silicon bin as-is, without fusing or disabling capabilities to mass-consolidate for volume sales. As a result, expect these new SKUs to have even worse availability. (Otherwise, wouldn't they have marketed them as the higher bin at Ampere's launch?) They didn't even bother to open up the power limit... so you're literally paying for nothing, because at 370W the 3080 10GB is already severely power limited.


----------



## Panchovix

Wondering if the XC3 VBIOS has better voltage "control" compared to the TUF VBIOS.

First pic is EVGA XC3 info with ABE










Second one is the TUF one










It seems some values are higher with one VBIOS and lower with the other; wondering which could be better for my shunted 3080, since I read (I think like a year ago lol) that with the EVGA XC3 VBIOS you may reach slightly better clocks at any given voltage vs other VBIOSes.


----------



## fray_bentos

mouacyk said:


> I am looking forward to combining DLDSR + Image Sharpening to get even more of an advantage in CODMW2019. Currently I use DLSS (Quality) + Image Sharpening, but the softening then sharpening makes too many edges glow and end up being distracting.
> 
> The real reason NVidia is coming up with all these SKUs is because the demand for GPUs have never been so high, that they can afford the marketing cost to sell each silicon bin as is without fusing or disabling capabilities to mass-consolidate for volume sales. As result, expect these new SKUs have even worse availability. (Otherwise, they would have marketed them as the higher bin at Ampere's launch?) They didn't even bother to open up the Power Limit... so, you're literally paying for nothing because at 370W the 3080 10GB is already severely power limited.


I wouldn't describe the 3080 as "power limited" when the 100 W change from 220 W (undervolt) to 320 W (stock PL) gives a <10% change in performance. That's "efficiency limited", not "power limited". I don't understand the appeal of running a 3080 even at the stock power limit, never mind anything above that. Some people around here are getting ~15% gains by doubling the power consumption; the heat, noise, and risk of hardware failure are simply not worth it.
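The efficiency argument is easy to put in perf-per-watt terms. The numbers below are illustrative only, built from the rough ratios quoted in this post (<10% for +100 W, ~15% for roughly double the power), not from measurements:

```python
def perf_per_watt(relative_fps: float, watts: float) -> float:
    """Relative performance per watt."""
    return relative_fps / watts

undervolt = perf_per_watt(100.0, 220.0)  # baseline: 220 W undervolt
stock     = perf_per_watt(109.0, 320.0)  # <10% faster for +100 W
shunted   = perf_per_watt(115.0, 440.0)  # ~15% faster for ~2x the power

# Efficiency relative to the undervolted baseline drops fast:
print(round(stock / undervolt, 2), round(shunted / undervolt, 2))
```

With these assumed figures, the stock card delivers about three-quarters of the undervolt's perf-per-watt, and the shunted card well under two-thirds.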


----------



## acoustic

fray_bentos said:


> I wouldn't describe the 3080 as "power limited" when the 100 W change from 220 W (undervolt) to 320 W (stock PL) gives <10% change in performance. That's "efficiency limited", not "power limited". I don't understand the appeal of running a 3080 at even at the stock power limit, never mind running anything above that. Some people around here are getting ~15% gains by doubling the power consumption; the heat, noise, and risk to hardware failure is simply not worth it.


Maybe at 1440p... but at 3840x1600, or at 4K, I need every frame I can get. Efficiency goes out the window when you need the framerate.


----------



## fray_bentos

acoustic said:


> Maybe at 1440p... but at 3840x1600, or at 4K, I need every frame I can get. Efficiency goes out the window when you need the framerate.


This is precisely why I am "still" on 1440p; GPUs simply aren't quite there yet for 4K gaming at the framerates I target (100 fps minimum). Where I have performance headroom I enable RTGI and DLDSR (or DSR) to improve quality. DLDSR is great.


----------



## Caffinator

What are your thoughts on VRAM? I am considering upgrading to a 3090; is 10GB too little?


----------



## Panchovix

Caffinator said:


> What are your thoughts on VRAM? I am considering upgrading to a 3090; is 10GB too little?


Haven't found VRAM limitations in my case, but I do play at 1440p 144Hz; I guess at 4K VRAM may be an issue


----------



## Imprezzion

Hell, even at 1080p 10GB isn't enough sometimes. Far Cry 6, for example, does not run properly on a 10GB 3080 with the HD texture pack enabled: half the textures won't load or get replaced with low-quality ones, the game constantly complains about too little VRAM, and usage just sits at 10GB the entire time. Forza Horizon 5 has the same issue; low-res road surfaces and bushes from time to time with all settings maxed, as it just doesn't have enough VRAM.


----------



## zebra_hun

Imprezzion said:


> Hell, even at 1080p 10GB isn't enough sometimes. Far Cry 6, for example, does not run properly on a 10GB 3080 with the HD texture pack enabled: half the textures won't load or get replaced with low-quality ones, the game constantly complains about too little VRAM, and usage just sits at 10GB the entire time. Forza Horizon 5 has the same issue; low-res road surfaces and bushes from time to time with all settings maxed, as it just doesn't have enough VRAM.


Forza Horizon 5 needs 7.5GB of VRAM on the Extreme preset at 2560x1440. I think an RTX 3080 is enough for FH5.
Here is my daily all-core 5GHz OC result:









Here is MSI AB Log:










What I write may not be true, but it's what I see.


----------



## acoustic

fray_bentos said:


> This is precisely why I am "still" on 1440p; GPUs simply aren't quite there yet for 4K gaming at the framerates I target (100 fps minimum). Where I have performance headroom I enable RTGI and DLDSR (or DSR) to improve quality. DLDSR is great.


Yep. The 3080 Ti handles 3840x1600 pretty well, but I'm still craving the next flagship card for more powah.


----------



## Panchovix

Imprezzion said:


> Hell, even at 1080p 10GB isn't enough sometimes. Far Cry 6, for example, does not run properly on a 10GB 3080 with the HD texture pack enabled: half the textures won't load or get replaced with low-quality ones, the game constantly complains about too little VRAM, and usage just sits at 10GB the entire time. Forza Horizon 5 has the same issue; low-res road surfaces and bushes from time to time with all settings maxed, as it just doesn't have enough VRAM.


I think it only has VRAM issues in those games, mostly FC6, but even a 12GB card would suffer as well; the texture memory management in that game is pretty bad lol.

If I were worried about VRAM, though, I would jump to a 6800XT/6900XT (for 16GB VRAM) or a 3090 (24GB VRAM). A 3080 Ti with 12GB is not worth it IMO, especially if it's like $1000 more expensive, and it would suffer the same VRAM issues as the RTX 3080; if you're going to spend that much money, better to go for the best of the best.

(Though I still think, in my case at least, I will jump to an RTX 4080/7800XT or whatever they call it before VRAM is an issue on a 3080.)


----------



## Broooo

Hi guys, I'm trying to shunt mod my card (3080 Suprim X) but I can't get it to work.
I don't know what I'm doing wrong, because I shunted all 6 resistors (5 in the 8-pin area and 1 for the PCIe slot) with the same 5 mOhm parts but nothing changed.
Initially I thought the hot glue was the problem, but I used a multimeter to check the connections and they are all ok…
I really don't know what to do, can anyone help me? Thank you!


----------



## mouacyk

Broooo said:


> Hi guys, I'm trying to shunt mod my card (3080 Suprim X) but I can't get it to work.
> I don't know what I'm doing wrong, because I shunted all 6 resistors (5 in the 8-pin area and 1 for the PCIe slot) with the same 5 mOhm parts but nothing changed.
> Initially I thought the hot glue was the problem, but I used a multimeter to check the connections and they are all ok…
> I really don't know what to do, can anyone help me? Thank you!


The Suprim X has 3x 8-pins. There are 500W BIOSes.

Did you scrape the shunt solder?


----------



## Broooo

mouacyk said:


> The Suprim X has 3x 8-pins. There are 500W BIOSes.
> 
> Did you scrape the shunt solder?


Are there 500W BIOSes?! Where can I find them?

What do you mean by "scrape"? I simply put another shunt resistor (5 mOhm) on top of the factory one and glued them together with hot glue; totally reversible, with no damage or solder at all.


----------



## mouacyk

Broooo said:


> Are there 500W BIOSes?! Where can I find them?
> 
> What do you mean by "scrape"? I simply put another shunt resistor (5 mOhm) on top of the factory one and glued them together with hot glue; totally reversible, with no damage or solder at all.


You can pretty much flash any 3x 8-pin BIOS to any 3x 8-pin GPU, at the cost of losing a random output port. If you ask around, someone's bound to have the EVGA XOC 500W BIOS. Otherwise, you can browse through the VGA Bios Collection | TechPowerUp.

When you stack shunt resistors, it's important to scrape the solder to remove the conformal coating; otherwise there is no, or very limited, electrical contact. Also, you will get little help with the hot glue method here, since it's quite unreliable. Others have done metal painting, and that has enough problems of its own. Only soldering is truly reliable.


----------



## Broooo

mouacyk said:


> You can pretty much flash any 3x 8-pin BIOS to any 3x 8-pin GPU, at the cost of losing a random output port. If you ask around, someone's bound to have the EVGA XOC 500W BIOS. Otherwise, you can browse through the VGA Bios Collection | TechPowerUp.
> 
> When you stack shunt resistors, it's important to scrape the solder to remove the conformal coating; otherwise there is no, or very limited, electrical contact. Also, you will get little help with the hot glue method here, since it's quite unreliable. Others have done metal painting, and that has enough problems of its own. Only soldering is truly reliable.


Thanks for your reply. I've been searching for this 500 W BIOS since I read about it 9 months ago, but I never found it (it seems like a legend or something…). If you know where to download it, can you please send a link?

Ok, so I'll probably solder the resistors… do I have to scrape the surface even if I solder them?
And another question: I have to shunt all 6 resistors as I did before, right?


----------



## mouacyk

If you solder, I believe the conformal coating evaporates when you melt the existing solder.


----------



## Broooo

mouacyk said:


> If you solder, I believe the conformal coating evaporates when you melt the existing solder.


Ok, thanks again. Anything regarding the 500 W BIOS?


----------



## mouacyk

450W BIOS: VGA Bios Collection: EVGA RTX 3080 10 GB | TechPowerUp
500W BIOS: (Link to file near last comments) New XOC BIOS for 3080 FTW3 ULTRA available now (Jacob Freeman on Twitter) : nvidia (reddit.com)
EDIT: There is no 500W BIOS for the 3080, only for the 3090. EVGA GeForce RTX 3080 FTW3 XOC BIOS (Page 20) - EVGA Forums


----------



## Broooo

mouacyk said:


> 450W BIOS: VGA Bios Collection: EVGA RTX 3080 10 GB | TechPowerUp
> 500W BIOS: (Link to file near last comments) New XOC BIOS for 3080 FTW3 ULTRA available now (Jacob Freeman on Twitter) : nvidia (reddit.com)


I double-checked all the comments but I couldn't find the file… I'm probably blind hahaha


mouacyk said:


> EDIT: There is no 500W BIOS for the 3080, only for the 3090. EVGA GeForce RTX 3080 FTW3 XOC BIOS (Page 20) - EVGA Forums


Oh ok, thanks anyway!


----------



## olllian

Hey guys. Can someone help me out, or is anyone else having lag when playing Warzone/Vanguard? I have not touched my 3D settings in game or in the NVIDIA Control Panel. All of a sudden last week my GPU utilization drops to 2% with VRAM maxed, and I can barely move my guy in game; it skips so badly when I try to do anything. I have tried GeForce Experience, which makes it worse. I don't run max graphics settings at all. I am getting 100+ fps, and 160 with DLSS on. Thx


----------



## Broooo

Guys do I have to shunt this resistor too?


----------



## acoustic

olllian said:


> Hey guys. Can someone help me out, or is anyone else having lag when playing Warzone/Vanguard? I have not touched my 3D settings in game or in the NVIDIA Control Panel. All of a sudden last week my GPU utilization drops to 2% with VRAM maxed, and I can barely move my guy in game; it skips so badly when I try to do anything. I have tried GeForce Experience, which makes it worse. I don't run max graphics settings at all. I am getting 100+ fps, and 160 with DLSS on. Thx


Reinstall drivers and use DDU for the uninstall. If it happened out of nowhere, could have malware on the PC as well - fresh install of Windows never hurt anybody.


----------



## Falkentyne

Broooo said:


> Guys do I have to shunt this resistor too?
> View attachment 2546450


I'm not completely sure what shunt that is, but if that's the SRC (power plane input power source) shunt, it absolutely _MUST_ be modded or you won't get any extra power from that card.

I know of at least one card where that's confirmed to be the SRC shunt.










Post a full HWiNFO64 screenshot of the expanded power wattage values while running the "Heaven" benchmark, with all values expanded (make sure your HWiNFO64 is up to date). I'm not sure what that shunt controls on your exact card. Some cards have a 2512 (normal-sized) shunt there, while some have a 1206-sized shunt (those can be found easily on Mouser). Note that 1W shunts are the easiest to mod, as the metal conductive edges are the same height as the middle part, making solder stacking very easy. 2W shunts are aggravating to mod, as the important parts (the edges) sit lower than the middle part, meaning you have to do a solder bridge to 'raise' the edges so they connect to the new shunt, and that's difficult without a good (at least 65W) iron.

And please don't use hot glue. Hot glue is garbage. Use solder and don't be lazy. Irons aren't expensive, and if you have a 3080, you can afford an iron.
A TS100 is a reasonable starter iron that's powerful enough to heat up the work area so the solder can flow and to melt the conformal coating.
(You should also get Kester 60/40 solder wire, some decent-quality rosin flux, and some brass coils to clean the iron tip. Protip: buy some 3M high-temp polyimide tape and tape around the shunts on the PCB, leaving only the shunt itself exposed. You will be VERY glad you did; it can save your butt big time if you have a solder accident.)


----------



## Broooo

Falkentyne said:


> I'm not completely sure what shunt that is, but if that's the SRC (power plane input power source) shunt, it absolutely _MUST_ be modded or you won't get any extra power from that card.
> 
> I know of at least one card where that's confirmed to be the SRC shunt.
> 
> View attachment 2546463
> 
> 
> Post a full HWinfo64 screenshot of the expanded power wattage values while running "Heaven" Benchmark, with all values expanded (make sure your hwinfo64 is up to date). I'm not sure what that shunt controls on your exact card. Some cards have a 2512 (normal sized) shunt there, while some have a 1206 sized shunt (those can be found easily on mouser). Note that 1W shunts are the easiest to mod as the metal conductive edges are the same height as the middle part, making solder stacking very easy. 2W shunts are aggravating to mod as the important parts (the edges) are lower than the middle part, meaning you have to do a solder bridge to 'raise' the edges so it connects the new shunt, and that's difficult without a good (at least 65W) iron.
> 
> And please don't use hot glue. Hot glue is garbage. Use solder and don't be lazy. Irons aren't expensive and if you have a 3080, you can afford an iron.
> TS100 iron is a reasonable starter iron that's powerful enough for heating up the work area so the solder can flow and to melt the conformal coating.
> (you should also get Kester 60/40 solder wire, and some decent quality rosin flux, and some brass coils to clean the iron tip. Protip: Buy some 3M high temp polyimide tape and tape around the shunts on the PCB, leaving only the shunt itself exposed--you will be VERY glad you did--it can save your butt big time if you have a solder accident).


Thank you very much for your reply, it's really appreciated.

I cannot post the HWiNFO screenshot right now because I have the 3080 out of the PC.

A few minutes ago I shunted the PCIe slot, the 3 plugs, the VMem and the GPU chip resistors with the soldering method, so all I have to shunt now is the SRC resistor (if you give me the ok), but I only have the 2512 1W 5 mOhm resistors. Can I use a wire to connect one of my 2512 resistors to the 1206 on the card? I've seen something like that done in the past…

I'm not going to buy from Mouser because I live in Italy, and for a single $0.52 resistor I'd have to pay $22.61 in international shipping…

EDIT: I used hot glue previously for warranty reasons, but I don't care anymore…
Although I think I did it correctly even with hot glue, it didn't work just because I hadn't shunted the SRC resistor.


----------



## Panchovix

I think hot glue doesn't work well for shunting the Ampere cards, so I soldered; though if you have decent skill you can desolder easily too.

I have the 3080 TUF, and the shunt mod worked (I used 8 mOhm on top of the default ones, so 1.62x the power), but I had to shunt all of these (this is the TUF PCB)










No SRC shunt explicitly labeled there, but maybe one of these does that lol
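For reference, the 1.62x figure follows from basic parallel-resistor math. A minimal sketch, assuming 5 mOhm stock shunts (typical for these cards, but check your own board); the controller keeps assuming the stock value, so real power is reported power times the ratio:

```python
def parallel_mohm(r1: float, r2: float) -> float:
    """Combined resistance of two shunts stacked in parallel, in mOhm."""
    return (r1 * r2) / (r1 + r2)

def power_multiplier(stock: float, stacked: float) -> float:
    """The controller still assumes the stock shunt value, so the card
    actually draws reported_power * stock / combined."""
    return stock / parallel_mohm(stock, stacked)

# 8 mOhm stacked on an assumed 5 mOhm stock shunt:
print(round(power_multiplier(5, 8), 3))  # 1.625, matching the 1.62x above
```

The same formula covers any stack: an equal-value stack (5 on 5) halves the sensed resistance, which is why those cards read exactly half their true power afterwards.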


----------



## Falkentyne

Broooo said:


> Thank you very much for your reply, it's really appreciated.
> 
> I cannot post the HWiNFO screenshot right now because I have the 3080 out of the PC.
> 
> A few minutes ago I shunted the PCIe slot, the 3 plugs, the VMem and the GPU chip resistors with the soldering method, so all I have to shunt now is the SRC resistor (if you give me the ok), but I only have the 2512 1W 5 mOhm resistors. Can I use a wire to connect one of my 2512 resistors to the 1206 on the card? I've seen something like that done in the past…
> 
> I'm not going to buy from Mouser because I live in Italy, and for a single $0.52 resistor I'd have to pay $22.61 in international shipping…
> 
> EDIT: I used hot glue previously for warranty reasons, but I don't care anymore…
> Although I think I did it correctly even with hot glue, it didn't work just because I hadn't shunted the SRC resistor.


Here are the 1206 shunts you need.



https://www.mouser.com/ProductDetail/71-WSLP12065L000FEA


Those are out of stock.

Try these. 4 mOhm but will still work fine (dante'afk used 3 mohm on SRC and PCIE slot and 5 mohm on all the others no problem).


https://www.mouser.com/ProductDetail/71-WSLP12064L000FEA


----------



## Broooo

Panchovix said:


> I think hot glue doesn't work well for shunting the Ampere cards, so I soldered; though if you have decent skill you can desolder easily too.
> 
> I have the 3080 TUF, and the shunt mod worked (I used 8 mOhm on top of the default ones, so 1.62x the power), but I had to shunt all of these (this is the TUF PCB)
> 
> View attachment 2546477
> 
> 
> No SRC shunt explicitly labeled there, but maybe one of these does that lol


I think one of the vertically mounted ones on the right of your card is the SRC.
You shunted 6 resistors in total with 2 8-pin connectors; I have 3 8-pin connectors and I shunted 6 resistors too… so probably I have to shunt the last, smaller one.


----------



## Falkentyne

Panchovix said:


> I think hot glue doesn't work well for shunting the Ampere cards, so I soldered; though if you have decent skill you can desolder easily too.
> 
> I have the 3080 TUF, and the shunt mod worked (I used 8 mOhm on top of the default ones, so 1.62x the power), but I had to shunt all of these (this is the TUF PCB)
> 
> View attachment 2546477
> 
> 
> No SRC shunt explicitly labeled there, but maybe one of these does that lol


One of those at the top is the SRC one. The isolated one at the bottom left is the PCIe slot.
If I had to bet money and take a guess, I would guess it's the bottom-left or bottom-right one in the far-right cluster. Probably whichever one is closest to the controller chip? You can always take a DMM and try to trace for continuity...


----------



## Broooo

Falkentyne said:


> Here are the 1206 shunts you need.
> 
> 
> 
> https://www.mouser.com/ProductDetail/71-WSLP12065L000FEA
> 
> 
> Those are out of stock.
> 
> Try these. 4 mOhm but will still work fine (dante'afk used 3 mohm on SRC and PCIE slot and 5 mohm on all the others no problem).
> 
> 
> https://www.mouser.com/ProductDetail/71-WSLP12064L000FEA


You probably haven't read what I wrote in the previous post; I'd rather not buy from Mouser due to the high shipping costs.
I asked if it was possible to connect my 2512 to the 1206 on the board with an electrical wire.

Anyway thank you so much for your time guys.


----------



## Panchovix

Broooo said:


> You probably haven't read what I wrote in the previous post; I'd rather not buy from Mouser due to the high shipping costs.
> I asked if it was possible to connect my 2512 to the 1206 on the board with an electrical wire.
> 
> Anyway thank you so much for your time guys.


Since I'm from Chile, I bought the shunts from AliExpress lol; they took like 2 weeks to arrive here.

Mouser's shipping cost is way too high outside the USA, so I feel you.

I got this one; it's kinda expensive now tbh lol, I bought it for like 5 bucks in 2020.









16.99US $ |Erjm1wsf8m0u Current Sense Resistors - Smd 2512 0.008ohm 1% New Original - Remote Control - AliExpress




----------



## Broooo

Panchovix said:


> Since I'm from Chile, I did buy the shunts from aliexpress lol, took like 2 weeks to arrive here.
> 
> Mouser shipping cost is way too high outside USA, so I feel you.
> 
> I got this one, now it's kinda expensive tbh lol, I bought for like 5 bucks in 2020.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 16.99US $ |Erjm1wsf8m0u Current Sense Resistors - Smd 2512 0.008ohm 1% New Original - Remote Control - AliExpress
> 
> 


I bought mine on eBay, but they are the 2512 form factor like yours; I don't have the 1206 form factor shunts, so I soldered one of my 2512s to the 1206 on the card via 2 tiny wires, and I checked the connection with a multimeter and it seems ok.


----------



## Broooo

Guys, I did it: with all 7 resistors soldered, the card finally shows half the power on screen.

I locked the voltage at 1.1V in Afterburner, but even with about 200-205 watts of total reported power draw I still see "PWR" as the "PerfCap Reason" in GPU-Z. I don't know why, but in RivaTuner I keep seeing 99% TDP even though all the values are halved: the 3 plugs, the PCIe slot… the only one that seems suspicious to me is the "GPU Chip Power Draw", which now reports about 130W, and I don't think my chip is eating 260+ W.

What should I do?


----------



## Broooo

Broooo said:


> Guys, I did it: with all 7 resistors soldered, the card finally shows half the power on screen.
> 
> I locked the voltage at 1.1V in Afterburner, but even with about 200-205 watts of total reported power draw I still see "PWR" as the "PerfCap Reason" in GPU-Z. I don't know why, but in RivaTuner I keep seeing 99% TDP even though all the values are halved: the 3 plugs, the PCIe slot… the only one that seems suspicious to me is the "GPU Chip Power Draw", which now reports about 130W, and I don't think my chip is eating 260+ W.
> 
> What should I do?


I just read about "internal limits" that don't allow the card to go beyond a certain power.
I managed to get 215W on screen (which should correspond to the BIOS's 430W limit, given the halved readings), but I don't think these "internal limits" are what's stopping me at 430W, also because with other BIOSes (Strix, XTREME) I got 450W… so there's something else limiting the card.
I believe that at 1.1V the card should push up to 600+ W (so 300+ on screen), right?

I really don't know what to do.


----------



## Panchovix

Broooo said:


> I just read about "internal limits" that don't allow the card to go beyond a certain power.
> I managed to get 215W on screen (which should correspond to the BIOS's 430W limit, given the halved readings), but I don't think these "internal limits" are what's stopping me at 430W, also because with other BIOSes (Strix, XTREME) I got 450W… so there's something else limiting the card.
> I believe that at 1.1V the card should push up to 600+ W (so 300+ on screen), right?
> 
> I really don't know what to do.


Internal rails seem to limit the 3080 in some cases; mine normally gets limited at about 490-500W at 1.1V (TUF VBIOS) with the power limit maxed, though I do get max power in my case (~560W) in Quake II RTX, for example.

Honestly, I don't know if there's a way to bypass that. I would get higher scores in Time Spy Extreme, since I get power limited there at 490W in the 2nd test lol


----------



## Broooo

Panchovix said:


> Internal rails seem to limit the 3080 in some cases; mine normally gets limited at about 490-500W at 1.1V (TUF VBIOS) with the power limit maxed, though I do get max power in my case (~560W) in Quake II RTX, for example.
> 
> Honestly, I don't know if there's a way to bypass that. I would get higher scores in Time Spy Extreme, since I get power limited there at 490W in the 2nd test lol


Yes, but if I can get 450W with other BIOSes (without the shunt mod), I don't think the internal rails are limiting me at 430W…
I've never heard of an Ampere card this limited, so there's something else messing with the power.
As I said previously, my "PerfCap Reason" is still "PWR"… like the mod was only effective on the values shown and not on the actual input power.


----------



## Falkentyne

Panchovix said:


> Internal rails seem to limit the 3080 in some cases; mine normally gets limited at about 490-500W at 1.1V (TUF VBIOS) with the power limit maxed, though I do get max power in my case (~560W) in Quake II RTX, for example.
> 
> Honestly, I don't know if there's a way to bypass that. I would get higher scores in Time Spy Extreme, since I get power limited there at 490W in the 2nd test lol


Most likely the "power limit" being exceeded is the calibration that reports to TDP Normalized %, which reads the internal rails. TDP% only reads the total of the 8-pins + PCIe slot, so TDP% is useless here.

The only way to bypass that limit is either to use an XOC VBIOS that already has these internal limits removed (the Kingpin 1kW BIOS, for example, removes these internal rail restrictions), or to mod and increase either the MSVDD or NVVDD voltage (as it seems these limits are somehow based on how much voltage is set), and that requires Elmor's hardware tool soldered on, except on Classified and possibly Galax HOF cards. And it cannot be done on Founders Edition cards with Elmor's tool, as the controller chip is completely read-only, with a 16-bit password to unlock write access (if this password is brute-forced and guessed wrong, the chip locks itself out until 3.3V power is cycled again).


----------



## Falkentyne

Broooo said:


> Yes, but if I can get 450W with other BIOSes (without the shunt mod), I don't think the internal rails are limiting me at 430W…
> I've never heard of an Ampere card this limited, so there's something else messing with the power.
> As I said previously, my "PerfCap Reason" is still "PWR"… like the mod was only effective on the values shown and not on the actual input power.


Can you please post an HWiNFO screenshot of all of the fully expanded power rails after a 3DMark Port Royal, Time Spy, or Unigine Heaven benchmark run, and make sure "TDP Normalized" is showing (so do not use GPU-Z; GPU-Z does not show TDP Normalized at all).

If your TDP Normalized reports 100%+, that means an internal rail has been exceeded.
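In other words, the two sensors answer different questions. A toy sketch of the diagnosis logic described in this exchange (the labels and the simple ">= 100" check are illustrative, not an NVIDIA API):

```python
def binding_limit(tdp_pct: float, tdp_normalized_pct: float) -> str:
    """Rough diagnosis per the posts above: TDP% totals the external rails
    (8-pins + PCIe slot), while TDP Normalized tracks the internal rails;
    whichever hits 100% first is the limit actually throttling the card."""
    if tdp_normalized_pct >= 100:
        return "internal rail exceeded"
    if tdp_pct >= 100:
        return "external power limit"
    return "not power limited"

# A shunted card can show TDP% well under 100 while an internal rail trips:
print(binding_limit(tdp_pct=99.0, tdp_normalized_pct=103.5))
# -> internal rail exceeded
```

Which matches the symptom reported below: 99% TDP in RivaTuner, ~103.5% TDP Normalized in HWiNFO, and "PWR" as the PerfCap reason.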


----------



## Broooo

Falkentyne said:


> Can you please post a hwinfo screenshot of all of the "fully expanded" power rails after a 3dmark Port Royal, Timespy or Unigine Heaven Benchmark run, and make sure "TDP Normalized" is showing (so do not use GPU-Z, GPU-Z does not show TDP Normalized at all).
> 
> If your TDP Normalized reports 100%+, that means an internal rail has been exceeded.


Yes, I'll turn on the PC soon and post the screenshot.
Unfortunately, however, I have already seen that TDP Normalized exceeds 100% (approximately 102.3%).

Couldn't it be that I made a mistake shunt modding one of the resistors? Because when I am limited by "PWR" my card cuts the voltage down to between 1.025 and 1.081V, so there should still be headroom for MSVDD and NVVDD to increase.
In fact, when I throttle down, the only "PerfCap Reason" is "PWR" and not "VRel/VOp".


----------



## Broooo

Falkentyne said:


> Can you please post a hwinfo screenshot of all of the "fully expanded" power rails after a 3dmark Port Royal, Timespy or Unigine Heaven Benchmark run, and make sure "TDP Normalized" is showing (so do not use GPU-Z, GPU-Z does not show TDP Normalized at all).
> 
> If your TDP Normalized reports 100%+, that means an internal rail has been exceeded.


Here is the screenshot. As you can see: 103.5%.
EDIT: This is after a full run of Time Spy.


----------



## andrew149

Does anyone have a favorite BIOS for the Gigabyte Gaming OC? With my stock BIOS I could get 100 MH/s on ETH with a power limit of 65%; I flashed the Gigabyte Aorus Xtreme BIOS, and now I need 70% power and +1500MHz on the memory instead of +1400MHz like the stock BIOS to get 100 MH/s.


----------



## andrew149

Looking for a hybrid cooler for a Gigabyte Gaming OC; does anyone have one, or something with a better heatsink? I already replaced the pads in mine, which made a huge difference, but I want to see my memory temps under 105C.


----------



## mouacyk

Any idea if the new LHR BIOS v2 Unlocker means that custom BIOS editing is incoming?








NVIDIA RTX LHR BIOS v2 Unlocker is infused with malware - VideoCardz.com


Do not download “Nvidia RTX LHR v2 Unlocker” The software contains malware. It connects to a remote server and is likely designed to steal user data and PC compute power for malicious reasons. As soon as the hacker released the software it was analyzed by the Red Panda Mining team: Do not...









How is a custom driver any different from the custom signing that NVCleanstall already does, which is even EAC compatible?


----------



## Panchovix

mouacyk said:


> Any idea if the new LHR BIOS v2 Unlocker means that custom BIOS editing is incoming?
> 
> 
> 
> 
> 
> 
> 
> 
> NVIDIA RTX LHR BIOS v2 Unlocker is infused with malware - VideoCardz.com
> 
> 
> Do not download “Nvidia RTX LHR v2 Unlocker” The software contains malware. It connects to a remote server and is likely designed to steal user data and PC compute power for malicious reasons. As soon as the hacker released the software it was analyzed by the Red Panda Mining team: Do not...
> 
> 
> 
> 
> videocardz.com
> 
> 
> 
> 
> 
> How is a custom driver any different than the custom signing that NVCleanstall already does, which is even EAC compatible?


I really doubt it's real, but if it is, damn, that would be amazing lol; not for LHR per se, but because you could maybe edit VBIOSes like in the Kepler/Maxwell era


----------



## andypandy2514

.


----------



## Damie

Ordered a Heatkiller V for my XC3 Ultra; should I get this backplate too, or will my VRAM temps be fine without it?






HEATKILLER® V Backplate for Zotac Trinity RTX 3080/3090 - Black, 36,95 €


Watercool Heatkiller V backplate for Nvidia's RTX 3080/3090 graphics cards. Fits our coolers of the latest Heatkiller V series.




shop.watercool.de


----------



## mouacyk

The moddiy 8-pin is wrong? Just got them from ppc. The top one is Bykski; I've been using it since Jan 2021 with no issues. Is the moddiy pinout even safe? I haven't tried it and won't.









And they don't fking fit on the Gigabyte RTX 3080! Slightly smaller female pins


----------



## mouacyk

Strangely, the stock Gigabyte ones match moddiy's, and I did test the stock ones initially. Perhaps there are two standards?

Well, I shortened both Bykski cables, went direct from card to PSU, and everything's working fine. After 2 hours of grueling soldering (shunts, for the first time), then testing all contact points with a multimeter, my RTX is pulling 440W in Time Spy now!


----------



## mouacyk

Shunted benching with cold air. I guess I should have gone for 10 mOhm or 5 mOhm resistors instead of 15 mOhm.
Hooked up my external GTX Nemesis 360mm rad with the window open:


----------



## Panchovix

mouacyk said:


> Shunted benching with cold air. I guess I should have gone for 10mOhm or 5mOhm resistors instead of 15mOhm.
> Hooked up my external GTX Nemesis 360mm rad with window open:
> View attachment 2552154


2175MHz is pretty good though; if you can maintain that in games, you're sold.

15 mOhm is kinda "high", but at least you're 100% safe in all cases, and honestly the difference would be 50W tops, maybe (since it would, maybe, get limited by the internal rails first)


----------



## mouacyk

Panchovix said:


> 2175MHz is pretty good though; if you can maintain that in games, you're sold.
> 
> 15 mOhm is kinda "high", but at least you're 100% safe in all cases, and honestly the difference would be 50W tops, maybe (since it would, maybe, get limited by the internal rails first)


Based on @bmgjet's calculator, I wanted to make sure to stay under 100W for the PCIe slot and not blow the fuse. I'm using a Waterforce BIOS now, and I've only seen up to 80W (60 × 1.33) on it. But I guess it's possible that another BIOS might balance differently and cause the PCIe slot to draw up to 100W? Anyway, really glad how it turned out; first time soldering resistors and making custom short PCIe cables. Also lapped the 9900K die a bit further, so I'm seeing a max core delta of 9C now instead of 11C, and dropped the max core temp by 4C.
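The 60 × 1.33 arithmetic checks out. A small sketch (assuming 5 mOhm stock shunts and the ~100 W slot ceiling discussed above; both are this thread's figures, not an official spec):

```python
SLOT_CEILING_W = 100   # rough PCIe-slot safety ceiling discussed above
STOCK_MOHM = 5.0       # assumed stock shunt value
STACKED_MOHM = 15.0    # the 15 mOhm resistor stacked on top

# Sensed resistance drops to 15*5/(15+5) = 3.75 mOhm, so real power is
# reported power * 5/3.75 = 1.333x what the card thinks it draws.
combined = (STOCK_MOHM * STACKED_MOHM) / (STOCK_MOHM + STACKED_MOHM)
multiplier = STOCK_MOHM / combined

reported_slot_w = 60   # highest slot reading seen on this BIOS
actual_slot_w = reported_slot_w * multiplier

print(round(actual_slot_w))            # 80
print(actual_slot_w < SLOT_CEILING_W)  # True: under the ceiling
```

So a reported 60 W is really about 80 W at the slot, with ~20 W of margin to the assumed ceiling.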


----------



## mouacyk

One of my pin shunts failed. It reads 153W on Pin 1 and 126W on Pin 2. I will have to take it apart and redo them at some point, with proper use of flux.


----------



## yzonker

mouacyk said:


> Based on @bmgjet's calculator, I wanted to make sure to stay under 100W for the PCIe slot and not blow the fuse. I'm using a Waterforce BIOS now, and I've only seen up to 80W (60 × 1.33) on it. But I guess it's possible that another BIOS might balance differently and cause the PCIe slot to draw up to 100W? Anyway, really glad how it turned out; first time soldering resistors and making custom short PCIe cables. Also lapped the 9900K die a bit further, so I'm seeing a max core delta of 9C now instead of 11C, and dropped the max core temp by 4C.
> 
> View attachment 2552422


A BIOS won't balance differently, as it does not control the power balance; only the limits.


----------



## Audioboxer

Just got my EVGA 3080 FTW3 Ultra after being in the queue for like 1.5 years lol. It is, of course, the LHR version. So the first thing I did was download the LHR XOC 450W BIOS from the forums and install it. I'd never had a card with a BIOS switch before, so I used the Normal-position exe first and rebooted; MSI Afterburner was showing up to 118%. Switched to OC, rebooted, flashed the OC exe and rebooted again. Showing up to 118%. Flashing twice might not have been necessary.

Anyway, add me to the camp that's a bit suspect of this 3080's ability to pull anything near 450W










Not even hitting 400w.

EVGA CS say at 3440x1440 I shouldn't be worried and the score is fine.










Highest I've got it is 404.4w, this was whilst playing Ori and the Will of the Wisps and pushing DSR to 2.25x.










Time Spy Extreme, and finally I just ran a Port Royal I scored 12 866 in Port Royal

I don't think my scores are bad or anything, but with being watercooled, on a 450w BIOS, and now looking at some people's scores in here, I'm beginning to think all the posts about EVGA FTW3 power draw jank could have some truth to them.

Sadly Time Spy starts crashing with Core Clock at +165, so seems I lost the silicon lottery anyway. Though I don't know if using the curve might help here. On my 2080Ti I had to manually use the voltage curve to set 1.093v at 2100 for it to be stable.

*edit* - In saying that, Port Royal squeaked through at 165 on core I scored 12 942 in Port Royal Slight increase in score.

*edit2* - +190 on core I scored 12 984 in Port Royal I'm guessing Port Royal is "easier" to pass on what might be an unstable core clock in other benchmarks?

*edit3* - +200 on core had a crash. Dropped memory to 900, ran again I scored 13 022 in Port Royal So yeah, it seems Port Royal is likely able to let unstable OCs through if they hold on long enough 

Edit4 - Ran another Time Spy I scored 19 207 in Time Spy

Max draw, 400w


----------



## Panchovix

@Audioboxer It's pretty interesting that the VBIOS limits you at 400W even though it's a 450W VBIOS; that card has 3x8 pin, so it shouldn't have issues pulling that.

If you see higher scores out there, it's probably because of shunt modding. I mean, I did that on my TUF 3080 (because it has 2x8 pins) and I have slightly higher scores (20k graphics score in Time Spy, 13.3k in Port Royal) using about 450-480W, so you aren't losing much by not getting those extra 80W either.

The only thing that intrigues me is that you're using water; IMO you should be able to get better scores (I got mine on air)



mouacyk said:


> Based on @bmgjet's calculator, I wanted to make sure to stay under 100W for the PCIe slot and not blow the fuse. I'm using a Waterforce BIOS now, and I've only seen up to 80W (60*1.33) on it. But I guess it's possible that another BIOS might balance differently and cause the PCIe slot to draw up to 100W? Anyway, really glad with how it turned out, first time soldering resistors and making custom short PCIe cables. Also lapped the 9900K die a bit further, so I'm seeing a max core delta of 9C now instead of 11C, and max core temp dropped 4C.
> 
> View attachment 2552422


With 15mOhm you're more than safe, I would say. In my case, with 8mOhm shunts, the max draw I have seen from the PCIe slot is 90W, while from each 8-pin (the TUF has 2x8) I've seen about 220-230W or so. No issues in almost 8 months.




mouacyk said:


> View attachment 2552548
> 
> One of my pin shunts failed. It reads 153W on Pin1 and 126W on Pin2. I will have to take it apart and redo them at some point, with proper use of flux


I did solder the resistors with proper use of flux and it's still working so far (in my case it's still pretty balanced, at 150*1.62W, near 220-230W). BTW, is that 22GHz on the VRAM? That's a pretty beefy VRAM OC lol


----------



## Audioboxer

Panchovix said:


> @Audioboxer It's pretty interesting the VBIOS limits you at 400W even with a 450W VBIOS, that card has 3x8 pin so it shouldn't have issues for pulling that.
> 
> If you see higher scores out there, it's probably because of shunt modding. I mean, I did that on my TUF 3080 (because it has 2x8 pins) and I have slightly higher scores (20k graphics score in Time Spy, 13.3k in Port Royal) using about 450-480W, so you aren't losing much by not getting those extra 80W either.
> 
> The only thing that gets me intrigued is that you're using water, IMO you may be able to get better scores (I got mine on air)
> 
> 
> With 15mOhm you're more than safe I would say, on my case, with 8mOhm shunts, the max draw I have seen from the PCI-E is 90W, meanwhile from each 8-Pin (The TUF has 2x8) I've got about 220-230W or so, no issues in almost like 8 months?
> 
> 
> 
> 
> I did solder the resistances with proper use of flux and still working so far (on my case still pretty balanced at 150*1.62W (near 220-230W)), BTW is that 22Ghz on the VRAM? That's a pretty beefy VRAM OC lol












BIOS definitely says it should be able to do 450w. Highest I've seen it though is 404w.

Using a HX1000i, and as for power cables, it's the Corsair official individually sleeved ones. So not cheap brand cables or anything.


----------



## Audioboxer

Ooft, these cards are pretty respectable undervolters, 1935 @ 0.825v I scored 18 052 in Time Spy

Still +1000 on memory.


----------



## Audioboxer

Ran two of these I scored 1 in Time Spy Stress Test

Played games for a few hours.










Curve 0.9v at 2025. Consistently runs 2010mhz in Time Spy.

Seems really decent. Even though I can cool fine up to 400w, this drawing around 300~310w max just seems really efficient. For 95% of games there's no real need to push any more at 1440p.

I guess Flight Simulator 2020 pre-DLSS, when you're on the edge of your seat clawing for every FPS, is where doing 21xx on core and 400w will help.

Will try higher but I'd be really surprised if 0.9v can go much more.

*edit* - This is as high as it goes at 0.9v, but still really happy with that. Ran another Time Spy I scored 1 in Time Spy Stress Test Will continue playing more games over the next few days.


----------



## Panchovix

Audioboxer said:


> Ran two of these I scored 1 in Time Spy Stress Test
> 
> Played games for a few hours.
> 
> View attachment 2552797
> 
> 
> Curve 0.9v at 2025. Consistently runs 2010mhz in Time Spy.
> 
> Seems really decent. Even though I can cool fine up to 400w, this drawing around 300~310w max just seems really efficient. For 95% of games no real need to push any more at 1440p.
> 
> I guess Flight Simulator 2020 pre-DLSS when you're on the edge of your seat clawing for every FPS doing 21xx on core and 400w will help.
> 
> Will try higher but I'd be really surprised if 0.9v can go much more.
> 
> *edit* - This is as high as it goes at 0.9v, but still really happy with that. Ran another Time Spy I scored 1 in Time Spy Stress Test Will continue playing more games over the next few days.


2025Mhz at 0.9V is pretty good actually, I wish I could do that lol, I do like 1905 at 0.9V, even maybe less, so pretty good undervolter you got there, congratz


----------



## Audioboxer

Panchovix said:


> 2025Mhz at 0.9V is pretty good actually, I wish I could do that lol, I do like 1905 at 0.9V, even maybe less, so pretty good undervolter you got there, congratz


Not calling it in until I do days of testing. Been reading it can be a real challenge to be 100% certain with big undervolts. Downloading Metro Exodus just now as all advice says the RTing in that can be great for finding instability.


----------



## mouacyk

Audioboxer said:


> Not calling it in until I do days of testing. Been reading it can be a real challenge to be 100% certain with big undervolts. Downloading Metro Exodus just now as all advice says the RTing in that can be great for finding instability.


Yep, Metro is to GPUs what y-cruncher is to CPUs. While I could run 2175+ in most games at 1.1v, I needed to reduce to 2100 for Metro. In the past, I used Crysis 3 for stabilizing my 1080 Ti and 980 Ti.


----------



## Audioboxer

mouacyk said:


> Yep, Metro is to GPUs what y-cruncher is to CPUs. While I could run 2175+ on most games at 1.1v, needed to reduce to 2100 for Metro. In the past, I used Crysis3 for stabilizing 1080TI and 980TI.


Done downloading, so will test. I forgot in 3DMark you can actually do the Port Royal on a loop which is at least using RTing.

Even creeping up from 2010 seems to put the foot on the voltage pedal. 2025 @ 0.925v I scored 1 in Port Royal Stress Test

Interesting that going from 0.9v to 0.925v, even on water, got the card temp up about 1~2 degrees. Power draw is around 310~320w instead of around 300w, so I guess that makes sense. Someone has likely long since made one, but I presume there's a graph out there of voltage versus performance gained.

As in, to push something like 2100 you end up with a pretty steep incline in the voltage and wattage needed.
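As a rough sanity check on that incline: dynamic power scales roughly with frequency times voltage squared. A first-order sketch (the 300W / 2010MHz / 0.9v baseline is from the numbers above; leakage and the rest of the board are ignored, so treat it as a trend, not a measurement):

```python
# First-order dynamic power scaling: P ~ f * V^2.
# Ignores leakage and fixed board power, so this is only a trend line.

def scaled_power(p0: float, f0: float, v0: float, f1: float, v1: float) -> float:
    """Estimate power at (f1, v1) from a measured point (p0, f0, v0)."""
    return p0 * (f1 / f0) * (v1 / v0) ** 2

# 300W at 2010MHz / 0.900v -> estimated cost of 2025MHz / 0.925v
print(round(scaled_power(300, 2010, 0.900, 2025, 0.925)))  # ~319W, in line with the observed 310~320w
```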


----------



## Audioboxer

I now declare Metro the killer of graphics card overclocks. Wish I hadn't wasted hours doing 3DMark testing and playing other games without RTing when all I had to do was give this a spin! lol

Back to the drawing board, running 2.0 (1995mhz) at 0.925v to start with. Holding up fine so far, past the intro and into the game.

It also hammers the power draw with RT on Ultra, so I guess that's what helps it find instability over 3DMark.


----------



## yzonker

Audioboxer said:


> I now declare Metro the killer of graphics cards overclocks. Wish I hadn't wasted hours doing 3DMark testing and playing other games without RTing when all I had to do was give this a spin! lol
> 
> Back to the drawing board, running 2.0 (1995mhz) at 0.925v to start with. Holding up fine so far, past the intro and into the game.
> 
> It also hammers the power draw with RT on Ultra, so I guess that's what helps it find instability over 3DMark.


Yea I was doing the same thing the other day. I had a curve that seemed stable in CP2077, GTAV, and RDR2. But crashed within 5 minutes in Metro. Lol.

Oddly, iRacing is even worse. My curve that's stable even in Metro crashed instantly in iRacing. It's so undemanding though that I just set the card to stock for it.


----------



## Audioboxer

yzonker said:


> Yea I was doing the same thing the other day. I had a curve that seemed stable in CP2077, GTAV, and RDR2. But crashed within 5 minutes in Metro. Lol.
> 
> Oddly, iRacing is even worse. My curve that's stable even in Metro crashed instantly in iRacing. It's so undemanding though that I just set the card to stock for it.


Yeah it's definitely good at finding issues. I mean, it even catches stuff the second it loads in lol.

Been running 2010 at 0.925v, which drops to 1995mhz in game, for over half an hour now. That's probably more in line with typical undervolts, rather than me thinking it was going to be fine at 0.9v.

Power draw anywhere from 310w inside to 360-370w outside.


----------



## fray_bentos

Audioboxer said:


> Done downloading, so will test. I forgot in 3DMark you can actually do the Port Royal on a loop which is at least using RTing.
> 
> Even creeping up from 2010 seems to put the foot on the voltage pedal. 2025 @ 0.925v I scored 1 in Port Royal Stress Test
> 
> Interesting going from 0.9v to 0.925v even on water got the card temp up about 1~2 degrees. Power draw around 310~320w instead of around 300w, so I guess that makes sense. Someone else has likely long done it, but presume there will be a graph of voltage efficiency versus performance gained.
> 
> As in, to push like 2100 you end up with a pretty steep incline in voltage needed and wattage.


Here's my data, refined and stability tested over 17 months of gameplay. Going from 825 mV to 900 mV nets about a 5% performance improvement for about 100 W more power. Really not worth it. I've stayed aircooled in the efficiency sweet spot. I occasionally set 900 mV when I want every last frame.
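Put in perf-per-watt terms, that trade looks even worse. A quick sketch; the 220W and 320W operating points are assumptions broadly in line with figures posted in this thread, not exact measurements:

```python
# Efficiency cost of chasing the last few percent: +5% performance for ~+100W.
# Assumed wattages: ~220W at 825mV, ~320W at 900mV (illustrative, not measured).

def perf_per_watt(perf: float, watts: float) -> float:
    return perf / watts

base = perf_per_watt(1.00, 220)  # 825mV operating point
high = perf_per_watt(1.05, 320)  # 900mV operating point
print(round(high / base, 2))     # 0.72: ~28% worse efficiency for 5% more performance
```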


----------



## Panchovix

Audioboxer said:


> I now declare Metro the killer of graphics cards overclocks. Wish I hadn't wasted hours doing 3DMark testing and playing other games without RTing when all I had to do was give this a spin! lol
> 
> Back to the drawing board, running 2.0 (1995mhz) at 0.925v to start with. Holding up fine so far, past the intro and into the game.
> 
> It also hammers the power draw with RT on Ultra, so I guess that's what helps it find instability over 3DMark.


It's nice to know that Metro can test even the subtlest undervolts/overclocks, or both. Gonna try it one of these days and see how it goes. I've been running my 3080 at 1860Mhz at 0.881V and so far it hasn't had issues in modded Skyrim, Warzone, FH5, etc; but maybe Metro will manage to crash it lol


----------



## Audioboxer

Panchovix said:


> It's nice to know that Metro kinda can test even the subtlest undervolts/overclocks or both, gonna try in one of these days how it goes, I've been using on my 3080 1860Mhz at 0.881V and for now it haven't had issues on modded skyrim, warzone, FH5, etc; but maybe Metro will manage to crash it lol


Definitely, with everything cranked on max and RT on ultra, albeit with DLSS on quality, this thing can crash in seconds with some undervolts lol.


----------



## fray_bentos

Panchovix said:


> It's nice to know that Metro kinda can test even the subtlest undervolts/overclocks or both, gonna try in one of these days how it goes, I've been using on my 3080 1860Mhz at 0.881V and for now it haven't had issues on modded skyrim, warzone, FH5, etc; but maybe Metro will manage to crash it lol


So basically, you decided that shunt modding wasn't worth it and undervolted instead? Your 1860 MHz at 881 mV is super close to what I found myself at the same voltage (1875 MHz).


----------



## mouacyk

fray_bentos said:


> So basically, you decided that that shunt modding wasn't worth it and you decided to undervolt instead? Your 1860 MHz at 881 mV is super close to what I found myself at the same voltage (1875 MHz).


Is that supposed to be a jab at shunt modding? I think even on very cold days with the windows open, @Panchovix can still do some very good overclocked benching runs. That's half the fun, lol.


----------



## fray_bentos

mouacyk said:


> Is that supposed to be a jab at shunt modding? I think even on very cold days with the windows open, @Panchovix can still do some very good overclocked benching runs. That's half the fun, lol.


Voiding warranty and risking destroying a hard-to-find, expensive GPU for the sake of running a few benches. Hard. Pass.


----------



## Panchovix

fray_bentos said:


> So basically, you decided that that shunt modding wasn't worth it and you decided to undervolt instead? Your 1860 MHz at 881 mV is super close to what I found myself at the same voltage (1875 MHz).


I undervolt while gaming, and overclock for benching and some 4K games (when the power is needed). You will destroy the card if you don't know what you're doing, though, especially if you don't know how to solder.

If you do know, you'll be fine; also, the card will only draw the power it needs, even if it's shunt modded.

Half the fun of getting a graphics card is overclocking and benching. I mean, we're on an overclocking forum lol.

EDIT: Also, here in Chile the warranty only lasts 3 months, which kinda sucks lol. After my warranty expired, I immediately shunt modded (like I did with my 2070S, 1660S, etc)


----------



## TK421

What's the highest power limit BIOS I can download for a 2x8-pin RTX 3080, MSI Ventus 3X model, non-LHR?


----------



## Audioboxer

So after playing Metro Exodus (good game!) for hours last night and continuing to run some more Port Royal stress loops back to back (something that uses RTing), I'm confident in saying 0.925v at 2010mhz is stable I scored 1 in Port Royal Stress Test

Pretty reasonable! At 1440p I think this is a good undervolt, and performance above this frequency is likely going to be worth 3~10FPS at most in some titles. Though when it comes to some RT-heavy games, that's when letting the juice flow and pulling around or above 400w might be more of a necessity on something like a 3080.

I definitely recommend Metro Exodus EE for stress testing!


----------



## TK421

Audioboxer said:


> So after playing Metro Exodus (good game!) for hours last night and continuing to run some more Port Royal Stress loops back to back (something that uses RTing), I'm confident to say 0.925v at 2010mhz is stable I scored 1 in Port Royal Stress Test
> 
> Pretty reasonable! At 1440p I think this is a good undervolt and performance above this frequency is likely going to be 3~10FPS at most with some titles. Though when it comes to some RTing heavy games that is when allowing the juice to flow and pulling around or above 400w might be more of a necessity on something like a 3080.
> 
> I definitely recommend Metro Exodus EE for stress testing!


which area did you use for testing?


----------



## Audioboxer

TK421 said:


> which area did you use for testing?


Anywhere lol. I just started playing the game for the first time last night (had it in my Steam library since Winter sale) and I could produce crashes minutes in. So, the intro section.

It seems just having RTing on and cranking the details up and this game is great at finding issues. I'm guessing it's the combination of RTing and the power draw the game uses. It draws quite a bit more power than Port Royal does for me.


----------



## TK421

Audioboxer said:


> Anywhere lol. I just started playing the game for the first time last night (had it in my Steam library since Winter sale) and I could produce crashes minutes in. So, the intro section.
> 
> It seems just having RTing on and cranking the details up and this game is great at finding issues. I'm guessing it's the combination of RTing and the power draw the game uses. It draws quite a bit more power than Port Royal does for me.


is your card 2x8 pin or 3x8 pin?

I'm looking for the highest tdp vbios I can flash into the msi 3080 ventus which has only 2x8pin.


----------



## Audioboxer

TK421 said:


> is your card 2x8 pin or 3x8 pin?
> 
> I'm looking for the highest tdp vbios I can flash into the msi 3080 ventus which has only 2x8pin.


3x8 pin. Even with a 450w BIOS, the max I've seen it go to so far is 404w. Though I've now capped it at 0.925v and Metro "only" pulls around 350-370w max.

When I had a 2x8-pin 2080Ti I had a 373w BIOS. I think on 2x8-pin something around 380w is really your theoretical limit.


----------



## TK421

Audioboxer said:


> 3 pin. Even with a 450w bios max I've seen it go to so far is 404w. Though I've now capped at 0.925v and Metro "only" pulls around 350-370w max.
> 
> When I had a 2 pin 2080Ti I had a 373w bios. I think on 2 pin something around 380w is really your theoretical limit.


Which card is 3x8 pin?

Try 3DMark Time Spy Extreme graphics test 2 (only); you can see some absurd power draws in that test.


----------



## Audioboxer

TK421 said:


> 3pin is which card?
> 
> try 3dmark timespy extreme graphics test 2 (only), you can see some absurd power draws in that test


The EVGA cards have 3x8 pin


----------



## Panchovix

TK421 said:


> is your card 2x8 pin or 3x8 pin?
> 
> I'm looking for the highest tdp vbios I can flash into the msi 3080 ventus which has only 2x8pin.


Max on 2x8 pin is like 350W via VBIOS (even if some of them say 366 or 375W, the max will be 350W); that's why I shunted my TUF (to use about 450-500W).

3x8-pin cards have the 450W VBIOS option.
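The ~350W ceiling on 2x8-pin cards lines up with the PCIe spec power budgets: 75W from the slot plus 150W per 8-pin connector. A quick sketch (spec values; real cables can deliver more, which is what shunt mods exploit):

```python
# PCIe spec power budgets: 75W from the slot, 150W per 8-pin connector.
# Vendors keep the VBIOS limit a little under the spec ceiling.

SLOT_W = 75
EIGHT_PIN_W = 150

def spec_limit(n_eight_pin: int) -> int:
    """Spec-compliant power ceiling for a card with n 8-pin connectors."""
    return SLOT_W + n_eight_pin * EIGHT_PIN_W

print(spec_limit(2))  # 375W ceiling, hence ~350W VBIOS caps on 2x8-pin cards
print(spec_limit(3))  # 525W ceiling, with room for a 450W VBIOS on 3x8-pin cards
```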


----------



## tcclaviger

In the same family, 3080 Ti with 2x8 pin. Pulled around 550-600 watts today; can't get an exact number for many reasons. No ill effects. 15631 PR, hold all 5950X + 3080 Ti #1 spots on 3DMark, and almost all 3080 Ti spots on Superposition.

Did the same with my 2080ti and 1080ti, pushed hard at unlimited power for years, both still work just fine with no degradation that I can spot.

Just use some common sense. GPU over 60c on air, probably don't go and pull 500 watts...

Retail 3080ti Gaming OC, stock bios + shunts for gaming. HOF bios for benchmarking.

That's keeping it very cold though, test started with GPU idle at 6c, max was like 27.5c.

Cables/connectors/backplate/etc not even warm to the touch.

As for warranty, if you own a GB card, well you don't have one anyways so....shunt away


----------



## TK421

Audioboxer said:


> EVGA have 3 pin


The problem with FTW3 cards is that the power balance is wonky, so the PCIe slot gets loaded to 75w and limits the rest of the card.

If you fix that with a different BIOS somehow, the comically small fuse on the PCIe slot will blow first.


----------



## Audioboxer

TK421 said:


> problem with FTW3 cards is that the power balance is wonky so the pcie slot gets loaded to 75w and limits the rest of the card
> 
> if you fix that with a different bios somehow, the comically small fuse on the pcie slot will blow first


The firmware updates must have fixed that because my PCIe slot is usually around 40-60w. Definitely not maxed out first.


----------



## Panchovix

TK421 said:


> problem with FTW3 cards is that the power balance is wonky so the pcie slot gets loaded to 75w and limits the rest of the card
> 
> if you fix that with a different bios somehow, the comically small fuse on the pcie slot will blow first


TIL EVGA still uses small fuses on their top-end cards; not even ASUS does that on their "low end" cards, like the TUF


----------



## tcclaviger

That comically small fuse is good to 120 watts on the PCIe slot. The problem isn't the fact that it's fused; the problem is the balance, aka the ratio between the PCIe slot and the PCIe 8-pins.

On 20/40/40 2x8-pin cards, ~600 watts is the danger point for the fuse.
With the ideal 3x8-pin balance of 14.2/28.6/28.6/28.6, it's ~840 watts before fuse risk.
On cards that cap the PCIe slot and pull whatever is demanded from the 3x8-pins, the limit is the cable/connector limit, since those aren't fused (Strix, HOF etc).

EVGA screwed up the 14.2/28.6x3 balance; that's the issue.
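Those danger points fall out of one division: the total board power at which the slot's share hits the fuse rating. A quick sketch using the shares above (the 96W line assumes an 8A fuse at 12V):

```python
# Total board power at which the PCIe slot's share of the draw reaches the
# fuse rating. Slot shares (20%, 14.2%) are the balance ratios quoted above.

def danger_point(fuse_w: float, slot_share: float) -> float:
    """Board power where slot draw = fuse rating."""
    return fuse_w / slot_share

print(round(danger_point(120, 0.20)))   # 2x8-pin, 20/40/40 split: 600W
print(round(danger_point(120, 0.142)))  # 3x8-pin, 14.2% slot share: ~845W
print(round(danger_point(96, 0.142)))   # same balance with an 8A (96W) fuse: ~676W
```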


----------



## TK421

tcclaviger said:


> That comically small fuse is good to 120 watts on the PCIe slot. The problem isn't the fact that it's fused, the problem is the balance, aka ratio between PCIe slot and PCIe 8pins.
> 
> On 20/40/40 2x8pin cards ~600 watts is the danger point for fuses.
> On ideals 3x8pin balance it's 14.2/28.6/28.6/28.6, it's ~840 watts before fuse risk.
> On cards that cap PCIe and pull whatever is demanded from the 3x8 pins the limit is the cable/connector limit since they're not fused (Strix, HOF etc).
> 
> EVGA screwed up the 14.2/28.6x3 balance, that's the issue.





Panchovix said:


> TIL EVGA still uses small fuses on their top end cards, not even ASUS do it on their "low end" cards, like the TUF


Do you think the small fuse is put in place as a band-aid fix for the bad power balancing?

EVGA's voltage controller is fully analog, so there's no way a firmware update can fix the power balance issues.

On their forum, I think some posts were removed in order to censor/hide the power balance issues; the direct post links don't work anymore, and they've disappeared from DDG/Google search.


----------



## yzonker

tcclaviger said:


> That comically small fuse is good to 120 watts on the PCIe slot. The problem isn't the fact that it's fused, the problem is the balance, aka ratio between PCIe slot and PCIe 8pins.
> 
> On 20/40/40 2x8pin cards ~600 watts is the danger point for fuses.
> On ideals 3x8pin balance it's 14.2/28.6/28.6/28.6, it's ~840 watts before fuse risk.
> On cards that cap PCIe and pull whatever is demanded from the 3x8 pins the limit is the cable/connector limit since they're not fused (Strix, HOF etc).
> 
> EVGA screwed up the 14.2/28.6x3 balance, that's the issue.


Some of the EVGA cards only have an 8 amp fuse on the PCIe slot. That's what my 3080 Ti FTW3 has. I'm pretty sure the redesigned 3090 FTW3 is also 8 amps.


----------



## Audioboxer

Found a good way to automate Metro testing: just hang out at the front of the train. The scenery constantly loops and it's taxing enough to weed out instability without you having to be there playing the game.

Obviously it's only cycling the same scenery, but it's a decent way to automate something that isn't just leaving yourself idle in a room.


----------



## tcclaviger

yzonker said:


> Some of the EVGA cards only have an 8 amp fuse on the pcie. That's what my 3080ti FTW3 has. I'm pretty sure the redesigned 3090 ftw3 also is 8 amps.


Well hell, eating my words. 8A is... just too small for serious OC without some sort of compensation to bypass it. Had no idea they went quite so small on it.

Good to know, thanks.


----------



## yzonker

tcclaviger said:


> Well hell, eating my words. 8a is...just too small for serious OC without compensation of some sort to bypass. Had no idea they went quite so small on it.
> 
> Good to know, thanks.


Hard to see in this pic, but it's down next to the shunt. You can easily tell it's a single-digit number though.



https://www.techpowerup.com/review/evga-geforce-rtx-3080-ti-ftw3-ultra/images/back.jpg


----------



## mouacyk

At the absolute limits:


----------



## zebra_hun

mouacyk said:


> At the absolute limits:
> View attachment 2553420
> 
> 
> View attachment 2553421
> 
> 
> View attachment 2553422
> 
> 
> View attachment 2553423
> 
> View attachment 2553424
> 
> 
> View attachment 2553425


Congrats  very very nice job
My scores with original Giga Gaming OC
Link


----------



## Panchovix

mouacyk said:


> At the absolute limits:
> View attachment 2553420
> 
> 
> View attachment 2553421
> 
> 
> View attachment 2553422
> 
> 
> View attachment 2553423
> 
> View attachment 2553424
> 
> 
> View attachment 2553425


Damn 20600 on TimeSpy! What a good overclocker, congratz man!


----------



## Middleman

Just installed the Asus TUF 3080 12GB OC edition. Loving this card; I like how the RAM is on a single side of the PCB.

So, one issue: games were crashing the system with a hard reset, and I wasn't sure what was wrong.

Turns out I had to lower the GPU speed by -100mhz, and I'm now testing with -75mhz. The card is boosting past its advertised speed, I believe.

Right now it's running 1965mhz.


----------



## dk10438

Middleman said:


> Just installed the Asus Tuf 3080 12GB OC edition. Loving this card, enjoy how the ram is on single side of the PCB.
> 
> So one issue, games were crashing the system, hard reset. i wasn't sure what was wrong.
> 
> Turns out i had to lower gpu speed by -100mhz, and now testing with -75mhz. Card is boosting past advertised speed i believe.
> 
> Right now its running 1965mhz.


TBH, that doesn't seem right. You shouldn't have to underclock the card out of the box.


----------



## Imprezzion

If it's at 1965 now after a -100 offset, that would mean the stock boost is a 2055-2070Mhz bin. That's quite optimistic for a 3080, especially if it's power limited and doesn't run full voltage (1.081-1.100v) at that speed.

Still, I agree with the above: a factory card should be stable at default settings at all times.

I still have my Gigabyte Gaming OC 3080 with a Bykski full cover block that I bought at release for near MSRP, and it's been super solid ever since.

It's still pretty badly power limited, even though removing the stock cooler and RGB did free up some power budget. It has a 2x8-pin "370w" BIOS, which in practice is around 344w. That's where it starts to power throttle.

I run it with a custom curve at 1965 @ 0.956v. That results in effective clocks around 1920Mhz at all times, and it stays just under the power limit, with the most demanding titles drawing around 330-335w, so it never power throttles. The VRAM is beastly on this card, zero ECC even at +1400, but even with the full cover block it gets pretty hot. VRAM runs around 72-76c at +1400. Core 48c, hotspot 56c.

Since the PCBs are all the same, and my block is 3080/3090-ready so it has all the VRAM spots covered, I kinda wanna upgrade to a 3080 Ti Gaming OC/Vision/Eagle just for the VRAM increase and a bit of extra horsepower, but it wouldn't be a good investment right now..


----------



## mouacyk

Middleman said:


> Just installed the Asus Tuf 3080 12GB OC edition. Loving this card, enjoy how the ram is on single side of the PCB.
> 
> So one issue, games were crashing the system, hard reset. i wasn't sure what was wrong.
> 
> Turns out i had to lower gpu speed by -100mhz, and now testing with -75mhz. Card is boosting past advertised speed i believe.
> 
> Right now its running 1965mhz.


It's advisable to run a gaming session with GPU-Z open and capture all the different temperatures. Hopefully it's just high ambient; otherwise, your card might have poor contact somewhere from the factory.


----------



## Panchovix

Middleman said:


> Just installed the Asus Tuf 3080 12GB OC edition. Loving this card, enjoy how the ram is on single side of the PCB.
> 
> So one issue, games were crashing the system, hard reset. i wasn't sure what was wrong.
> 
> Turns out i had to lower gpu speed by -100mhz, and now testing with -75mhz. Card is boosting past advertised speed i believe.
> 
> Right now its running 1965mhz.


At stock the card shouldn't crash; a friend had an RTX 3080 Gaming Z Trio that boosted at stock to 2070Mhz or so, and it never crashed.

Try a variety of games and see if it keeps happening, since you shouldn't need to underclock the card from stock, and also check the temps.


----------



## tcclaviger

Sounds familiar, seem to recall this from early 3080 10gb drivers.

Hopefully they sort you out.

You could log with HWiNFO and find the voltage and GPU speed at the point where it crashes, then tweak your curve there to accommodate it, minimizing the reduced area instead of dropping the whole curve.


----------



## mouacyk

Power efficiency scaling in port royal (custom 2560x1440, windowed, looped, 99% GPU usage):


----------



## fray_bentos

mouacyk said:


> Power efficiency scaling in port royal (custom 2560x1440, windowed, looped, 99% GPU usage):
> View attachment 2553905


Your data also suggest memory OC isn't worth it. I found the same early on, so I don't bother, as the risks of instability/memory failure from higher temps and the extra power consumption aren't worth it. Long gone are the days of the Maxwell 900 series, where you could get substantial gains from the memory.


----------



## Audioboxer

Been doing quite a bit of testing with voltages, mainly using Metro for how quickly it can crash unstable clocks. Leave it idle on the train. Constant 360w+ power draw at 3440x1440, Ultra RT and DLSS on quality.

What I've noticed with my card anyway is the ramp up for voltage requirement over 2000mhz on core.

2010mhz can get away with 0.95v. 2025mhz requires 0.975v. Anything above that and it's fast and high into 1.xxv territory.

So it seems the sweet spot is around reaching 2000mhz if you want to undervolt.










0.95v at 100% utilisation in Metro is pretty much a constant 37 degrees. 0.975v is 38 degrees, occasionally kissing 39 degrees. On water obviously.

Let the above loop for around an hour now.
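Quantifying that ramp as millivolts per megahertz between stable points: the pairs below are the points above plus a sub-2GHz pair posted elsewhere in the thread, so two different cards, and only the trend is meaningful.

```python
# Incremental voltage cost per MHz between stable V/F points.
# Pairs come from two different cards in this thread, so only the
# trend matters: the cost per MHz roughly triples above ~2GHz.

def mv_per_mhz(p0: tuple, p1: tuple) -> float:
    (f0, v0), (f1, v1) = p0, p1
    return (v1 - v0) * 1000 / (f1 - f0)

print(round(mv_per_mhz((1800, 0.831), (1920, 0.900)), 2))  # sub-2GHz: ~0.57 mV/MHz
print(round(mv_per_mhz((2010, 0.950), (2025, 0.975)), 2))  # above 2GHz: ~1.67 mV/MHz
```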


----------



## fray_bentos

Audioboxer said:


> Been doing quite a bit of testing with voltages, mainly using Metro for how quick it can crash unstable clocks. Leave idle in the train. Constant 360w+ power draw at 3440x1440, Ultra RT and DLSS on quality.
> 
> What I've noticed with my card anyway is the ramp up for voltage requirement over 2000mhz on core.
> 
> 2010mhz can get away with 0.95v. 2025mhz requires 0.975v. Anything above that and it's fast and high into 1.xxv territory.
> 
> So it seems the sweet spot is around reaching 2000mhz if you want to undervolt.
> 
> View attachment 2554435
> 
> 
> 0.95v at 100% utilisation in Metro is pretty much a constant 37 degrees. 0.975v is 38 degrees, occasionally kissing 39 degrees. On water obviously.
> 
> Let the above loop for around an hour now.


0.95 V isn't undervolting. A 3080 at stock will settle around 900 mV to hit the stock 320 W power limit under a true GPU-bound load. Undervolt sweet spot is getting 98-100% of stock performance at 825-850 mV.


----------



## Panchovix

fray_bentos said:


> 0.95 V isn't undervolting. A 3080 at stock will settle around 900 mV to hit the stock 320 W power limit under a true GPU-bound load. Undervolt sweet spot is getting 98-100% of stock performance at 825-850 mV.


The 3080 is rated to use up to 1.081V (if I'm not wrong) at stock, assuming you don't get power limited; so anything below that can be called an undervolt.

Also, 0.825-0.85V is IMO pretty low; can you get 1905Mhz or more at that voltage?


----------



## fray_bentos

Panchovix said:


> 3080 is rated to use until 1.081V (If I'm not wrong) at stock, assuming you don't get power limited; so anything below 1V can be called an undervolt.
> 
> Also 0.825-0.85V IMO is pretty low, can you get 1905Mhz or more with that voltage?


Yes, but those higher voltages are only ever hit at low loads (where the power limit is not being hit), in generally non-GPU bound situations, where you get no benefit from the extra voltage/clocks.

831 mV gets me 1800 MHz sustained = 98-99% of stock performance (stock settles at ~1815 MHz under heavy load). That undervolt gets 220-270 W load, <65 C on air with quiet fans (not audible through headphones; I sit 1 metre from my PC).

To get 1920 MHz, I need 900 mV, but that bounces off the 320 W power limit in some loads, only gives 3-4% performance over stock, and the same noise as stock (which is too much for me). I'd rather take 98-99% of the performance for 70-100 W less power, a lot less noise (fan and coil whine), less chance of hardware failure, and cooler RAM sticks. Even with the 831 mV undervolt, the graphics card memory "junction temperature" still hits 85 C, which I understand is "cool" for a 3080.
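The power saving fray_bentos describes lines up with the usual first-order scaling of dynamic power with V²·f. Here is a minimal sketch under that assumption (it ignores leakage and memory power, and takes ~900 mV / 1815 MHz / 320 W as the stock operating point quoted in this post):

```python
# Rough dynamic-power scaling check for the undervolt numbers above.
# Assumes board power is dominated by the C*V^2*f term (a simplification:
# static/leakage power and GDDR6X power are ignored).

def scaled_power(p_ref_w, v_ref, f_ref_mhz, v_new, f_new_mhz):
    """Scale a reference board power by the V^2 * f rule."""
    return p_ref_w * (v_new / v_ref) ** 2 * (f_new_mhz / f_ref_mhz)

# Stock-ish operating point from the thread: ~900 mV at ~1815 MHz, 320 W.
p = scaled_power(320, 0.900, 1815, 0.831, 1800)
print(f"Predicted power at 831 mV / 1800 MHz: {p:.0f} W")   # ~271 W
```

That lands at the top of the reported 220-270 W range, so the claimed 70-100 W saving is plausible from voltage scaling alone.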


----------



## Audioboxer

fray_bentos said:


> 0.95 V isn't undervolting. A 3080 at stock will settle around 900 mV to hit the stock 320 W power limit under a true GPU-bound load. Undervolt sweet spot is getting 98-100% of stock performance at 825-850 mV.


My understanding is anything under the default voltage curve would be classed as an undervolt? Because you're running the card at a lower voltage than default for the same boost clock. But if that's not what it traditionally means I've got no issue not saying undervolt lol.

I can see why most people doing aircooling are trying to get under 0.9v though to help with thermals. No point in me going that low on watercooling unless it's to try and drop wattage.


----------



## fray_bentos

Audioboxer said:


> My understanding is anything under the default voltage curve would be classed as an undervolt? Because you're running the card at a lower voltage than default for the same boost clock. But if that's not what it traditionally means I've got no issue not saying undervolt lol.
> 
> I can see why most people doing aircooling are trying to get under 0.9v though to help with thermals. No point in me going that low on watercooling unless it's to try and drop wattage.


It is semantics. However, if you are running a higher clock at a given voltage than the stock V/F curve would, and said voltage matches (or exceeds) what your card would be loaded with at stock, then that's a bog-standard overclock (the same or more voltage/power than stock to get more performance).

Ergo, it is a stretch to call any voltage that results in more power being pulled than the stock power limit of 320 W an undervolt (i.e. anything much over 900 mV).

Consistent with my own findings, Panchovix's 900 mV "undervolt" posted here pulls a peak of 330 W, which exceeds the 320 W stock power limit of a 3080:

RTX 3080 (shunt modded) comparison with Stock...


Hi there guys, maybe some known the post similar to this by me, of the 3060 Ti,[here RTX 3060Ti comparison with Stock, undervolt, overclock..., which was a 3060Ti Gaming OC Pro with 270W max TDP, I never shunted it either. Now, today I will do something similar (pretty similar), but with a TUF...




www.overclock.net





I agree, going lower than 900 mV on water cooling defeats the purpose of water cooling; more performance unlocked for more power, while keeping noise and temperatures in check.


----------



## acoustic

If the card at stock requires 1.037v for 2025, and I am now running 2025 @ 0.937v, I have undervolted the card. This seems like an argument just for argument's sake.


----------



## Audioboxer

acoustic said:


> If the card at stock requires 1.037v for 2025, and I am now running 2025 @ 0.937v, I have undervolted the card. This seems like an argument just for argument's sake.


That's the way I felt, but as I said I wasn't really looking for an argument or anything and if I'm wrong I have no problem leaving the term 'undervolting' for folks running like 0.8xv and things.

It's just an easy way for me to personally say I'm running this curve at less voltage than what it is running at with stock lol.

Because in terms of how these cards boost, a lot of it is just down to thermals. Yeah, voltage to be stable matters, but boosting to like 19xx/20xx and maintaining that is also a lot to do with cooling.

I've even read that if you're keeping really cool that might let you sneak in with slightly less voltage, but I'm not entirely sure if there is any truth in that. I personally just thought voltage required is purely silicon luck and not to do with heat. Boost capacity is more to do with heat.


----------



## yzonker

Audioboxer said:


> That's the way I felt, but as I said I wasn't really looking for an argument or anything and if I'm wrong I have no problem leaving the term 'undervolting' for folks running like 0.8xv and things.
> 
> It's just an easy way for me to personally say I'm running this curve at less voltage than what it is running at with stock lol.
> 
> Because in terms of how these cards boost, a lot of it is just down to thermals. Yeah, voltage to be stable matters, but boosting to like 19xx/20xx and maintaining that is also a lot to do with cooling.
> 
> I've even read that if you're keeping really cool that might let you sneak in with slightly less voltage, but I'm not entirely sure if there is any truth in that. I personally just thought voltage required is purely silicon luck and not to do with heat. Boost capacity is more to do with heat.


Cooler temps absolutely increase overclocking headroom. The catch is that NVIDIA's boost algorithm already takes advantage of most of it. 

I've gone from a bone stock Zotac 3090 trinity to the card in a custom loop capable of holding the core at 20-25C (chiller in the loop). For stable gaming, I'm still running the same core offset as I did when the card was bone stock (+150 core). The mem offset has increased quite a bit though since the mem does not change frequency automatically with temp.


----------



## mouacyk

Didn't realize I had to enter into the "Metro Exodus (ME) Enhanced Edition" beta channel to resolve some annoying issues: aspect ratio resetting, gray filter, low RTX performance, and low mouse responsiveness. After another 60GB download/update, it's a hoot now. Feels pretty damn atmospheric, and exactly how Fallout 1/2 should be in FPS perspective.

Getting around 100fps at 3840x1600, Ultra settings, [email protected] curve, 22200MHz memory, max 370W, RTX high, DLSS - quality. This game and GPU really showcase each other.


----------



## Imprezzion

mouacyk said:


> Didn't realize I had to enter into the "Metro Exodus (ME) Enhanced Edition" beta channel to resolve some annoying issues: aspect ratio resetting, gray filter, low RTX performance, and low mouse responsiveness. After another 60GB download/update, it's a hoot now. Feels pretty damn atmospheric, and exactly how Fallout 1/2 should be in FPS perspective.
> 
> Getting around 100fps at 3840x1600, Ultra settings, [email protected] curve, 22200MHz memory, max 370W, RTX high, DLSS - quality. This game and GPU really showcase each other.


100FPS at that res? Man, when I played through it I was happy to get 60 at 1080p native lol. It is a great power limit tester tho. It slams the card hard. On 1080p I can just about squeeze 1935 @ 0.931 out of it but no more than that. 

Btw, does anyone have some tips on how exactly to set a custom curve so the effective clocks are as close as possible to real clocks? The way I do it usually results in a pretty big delta between the 2. So, my curve is 1995 @ 0.981 and then flat after. I did also raise the entire section with lower voltages by the same amount to make a nice curved line. Clocks usually show 1995Mhz readout and ~1956Mhz effective. If I just use offset clocks and let the card throttle itself to 1995 it shows like 1981 effective but at a higher voltage.
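For illustration, the "raise the lower section, then flatten" curve edit described above can be sketched on plain data. MSI Afterburner does this through its GUI; the voltage grid and stock clocks below are made-up illustrative values, not real readings:

```python
# A minimal sketch of the "offset then flatten" V/F curve edit: shift
# every point below the chosen voltage up by a fixed offset so the curve
# reaches the target clock at that voltage, then clamp everything at or
# above it to the target clock.

CAP_MV, CAP_MHZ = 981, 1995  # target point from the post above

def flatten_curve(stock_curve, cap_mv=CAP_MV, cap_mhz=CAP_MHZ):
    """Return a new (mV, MHz) curve flattened at (cap_mv, cap_mhz)."""
    stock_at_cap = next(f for v, f in stock_curve if v == cap_mv)
    offset = cap_mhz - stock_at_cap
    return [(v, f + offset if v < cap_mv else cap_mhz) for v, f in stock_curve]

# Hypothetical stock curve points (mV, MHz):
stock = [(850, 1740), (900, 1830), (950, 1905), (981, 1950), (1050, 2010)]
for v, f in flatten_curve(stock):
    print(v, "mV ->", f, "MHz")
```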


----------



## mouacyk

Imprezzion said:


> 100FPS at that res? Man, when I played through it I was happy to get 60 at 1080p native lol. It is a great power limit tester tho. It slams the card hard. On 1080p I can just about squeeze 1935 @ 0.931 out of it but no more than that.
> 
> Btw, does anyone have some tips on how exactly to set a custom curve so the effective clocks are as close as possible to real clocks? The way I do it usually results in a pretty big delta between the 2. So, my curve is 1995 @ 0.981 and then flat after. I did also raise the entire section with lower voltages by the same amount to make a nice curved line. Clocks usually show 1995Mhz readout and ~1956Mhz effective. If I just use offset clocks and let the card throttle itself to 1995 it shows like 1981 effective but at a higher voltage.


I'm at the area where the train is initially stuck, and I have to wander around a swamp doing tasks to restore the bridge so the train can continue. I've seen lows in the 70s, but the average of 100fps was surprising to me. DLSS is on quality, so the effective res is something like 2560x1080, and I wonder if the beta channel (through GoG) is actually helping too.

What you described is exactly how I set up my curve, so it undervolts and overclocks at lower voltage points I know are stable. For me, it always drops 1 bin between what is set and what is effective.


----------



## Audioboxer

mouacyk said:


> Didn't realize I had to enter into the "Metro Exodus (ME) Enhanced Edition" beta channel to resolve some annoying issues: aspect ratio resetting, gray filter, low RTX performance, and low mouse responsiveness. After another 60GB download/update, it's a hoot now. Feels pretty damn atmospheric, and exactly how Fallout 1/2 should be in FPS perspective.
> 
> Getting around 100fps at 3840x1600, Ultra settings, [email protected] curve, 22200MHz memory, max 370W, RTX high, DLSS - quality. This game and GPU really showcase each other.


On Steam? Says no betas available to me.


----------



## Micko

Imprezzion said:


> Btw, does anyone have some tips on how exactly to set a custom curve so the effective clocks are as close as possible to real clocks? The way I do it usually results in a pretty big delta between the 2. So, my curve is 1995 @ 0.981 and then flat after. I did also raise the entire section with lower voltages by the same amount to make a nice curved line. Clocks usually show 1995Mhz readout and ~1956Mhz effective. If I just use offset clocks and let the card throttle itself to 1995 it shows like 1981 effective but at a higher voltage.



The smoother the curve drop is, the lower the difference between nominal and effective clock. The way most YouTube OC tutorials do it is to use a v/f curve with a sharp drop before the wanted frequency/voltage, and while that is the simplest and fastest way to create a curve, it does tend to push nominal and effective clocks apart, and that difference grows as the nominal clock increases. Something else I noticed is that the effective clock is what defines the performance of the card, i.e. two overclock profiles where the first has 2000/1950 nominal and effective clocks and the second has 1965/1950 should give more or less the same fps in games and benchmarks.

I lost the silicon lottery with my 3080, so in the end I decided to settle with 1815Mhz at 0.818v. It runs nice and cool even during the warm summer months.

With this type of v/f curve, the effective clock is 1 bin (15mhz) below the nominal clock 99% of the time. The effective clock can and will jump 10-15mhz up and down, but only for a split second. In my experience, this curve gives the lowest nominal/effective clock difference, but it requires a LOT more patience and testing than the usual sharp drop-off curve from oc/uv tutorials.
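The bin behaviour described here can be sketched as a pair of tiny helpers. The one-bin delta and the "equal effective clocks perform alike" rule are this poster's measurements, not documented NVIDIA behaviour:

```python
# Sketch of the observation above: Ampere clocks move in 15 MHz bins, a
# well-tuned curve reads one bin below nominal, and performance follows
# the effective clock rather than the nominal one.

BIN_MHZ = 15

def effective_clock(nominal_mhz, bins_dropped=1):
    """Effective clock after the GPU sheds `bins_dropped` 15 MHz bins."""
    return nominal_mhz - bins_dropped * BIN_MHZ

def same_performance(eff_a_mhz, eff_b_mhz):
    """Per the post: profiles with equal effective clocks benchmark alike,
    whatever their nominal clocks say."""
    return abs(eff_a_mhz - eff_b_mhz) <= BIN_MHZ

print(effective_clock(1830))          # one bin below 1830 -> 1815
print(same_performance(1950, 1950))   # 2000/1950 vs 1965/1950 -> True
```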


----------



## mouacyk

Audioboxer said:


> On Steam? Says no betas available to me.


Mine's on GoG, using Galaxy.


----------



## yzonker

Micko said:


> The smoother the curve drop is, the lower the difference between nominal and effective clock. The way most YouTube OC tutorials do it is to use a v/f curve with a sharp drop before the wanted frequency/voltage, and while that is the simplest and fastest way to create a curve, it does tend to push nominal and effective clocks apart, and that difference grows as the nominal clock increases. Something else I noticed is that the effective clock is what defines the performance of the card, i.e. two overclock profiles where the first has 2000/1950 nominal and effective clocks and the second has 1965/1950 should give more or less the same fps in games and benchmarks.
> 
> I lost the silicon lottery with my 3080, so in the end I decided to settle with 1815Mhz at 0.818v. It runs nice and cool even during the warm summer months.
> 
> View attachment 2554975
> 
> 
> With this type of v/f curve, the effective clock is 1 bin (15mhz) below the nominal clock 99% of the time. The effective clock can and will jump 10-15mhz up and down, but only for a split second. In my experience, this curve gives the lowest nominal/effective clock difference, but it requires a LOT more patience and testing than the usual sharp drop-off curve from oc/uv tutorials.


Interestingly, I've found the curve I show in the link below always gains some stability even when effective clocks are the same. I still don't have a good explanation as to why though. 

[Official] NVIDIA RTX 3090 Owner's Club


Hi guys, Picked up a 3090 Asus Strix OC at Microcenter but having doubts if I should hold onto it as have not broken the plastic seal yet. Mostly game at 2560x1440. Currently selling 4 of my old 980 Tis and 1080s on eBay and have raised almost half the cost of the Strix but feel the price is...




www.overclock.net


----------



## Micko

Interesting indeed! If anything, I would expect that your first curve, which has a higher nominal clock, should require higher voltage. Good to see you found a v/f curve that "clicked" with your card.


----------



## fray_bentos

Imprezzion said:


> 100FPS at that res? Man, when I played through it I was happy to get 60 at 1080p native lol. It is a great power limit tester tho. It slams the card hard. On 1080p I can just about squeeze 1935 @ 0.931 out of it but no more than that.
> 
> Btw, does anyone have some tips on how exactly to set a custom curve so the effective clocks are as close as possible to real clocks? The way I do it usually results in a pretty big delta between the 2. So, my curve is 1995 @ 0.981 and then flat after. I did also raise the entire section with lower voltages by the same amount to make a nice curved line. Clocks usually show 1995Mhz readout and ~1956Mhz effective. If I just use offset clocks and let the card throttle itself to 1995 it shows like 1981 effective but at a higher voltage.


At 0.981 V it will be very hard not to hit the power limit in some loads, so it could be that. Otherwise, if you are sure the power limit is not being hit, then not hitting the set frequency is down to temperature, especially if on air. Every 5 C over a certain temperature, the GPU will drop 15 MHz.

For example, when my GPU runs at <65 C, I get an effective frequency 0-15 MHz below what I set (checked in HWiNFO64). If that goes up to 70 C then it's -30 MHz on average, at 75 C -45 MHz, and so on. At 0.981 V it will be hard to keep temperatures in the sweet spot without water cooling. The better effective clocks are why undervolting works so well.
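The throttling pattern described here can be written as a toy model. The 65 C threshold and per-5C step are this poster's observations on his card, not documented NVIDIA numbers:

```python
# A toy model of the pattern above: below ~65 C the card holds its set
# clock (within one bin), and each further 5 C sheds another 15 MHz bin.

BIN_MHZ = 15
THRESHOLD_C = 65
STEP_C = 5

def thermal_offset_mhz(temp_c):
    """Expected clock offset from the set value at a given core temp."""
    if temp_c < THRESHOLD_C:
        return 0
    steps = (temp_c - THRESHOLD_C) // STEP_C + 1
    return -steps * BIN_MHZ

for t in (60, 65, 70, 75):
    print(f"{t} C -> {thermal_offset_mhz(t):+d} MHz")
```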


----------



## Imprezzion

fray_bentos said:


> At 0.981 V it will be very hard not to hit the power limit in some loads, so it could be that. Otherwise, if you are sure the power limit is not being hit, then not hitting the set frequency is down to temperature, especially if on air. Every 5 C over a certain temperature, the GPU will drop 15 MHz.
> 
> For example, when my GPU runs at <65 C, I get an effective frequency 0-15 MHz below what I set (checked in HWiNFO64). If that goes up to 70 C then it's -30 MHz on average, at 75 C -45 MHz, and so on. At 0.981 V it will be hard to keep temperatures in the sweet spot without water cooling. The better effective clocks are why undervolting works so well.


I'm on water. Bykski full cover with a 420+240 rad. GPU never sees over 48c.

This is my curve. In very light games like world of tanks effective clocks are fine tho. But not in division 2 for example.

Note the weird drop of the blue line.


----------



## mouacyk

The power limit penalties are pretty severe and, from my observations when my GPU was unshunted, behave cumulatively. First hitting it drops only 1 bin, but repeated hits increase the bin drops, then each 5C drops another bin. I'm just not sure if the algorithm compounds power limit and temp together for an even worse drop.


----------



## fray_bentos

Imprezzion said:


> I'm on water. Bykski full cover with a 420+240 rad. GPU never sees over 48c.
> 
> This is my curve. In very light games like world of tanks effective clocks are fine tho. But not in division 2 for example.
> 
> Note the weird drop of the blue line.
> 
> View attachment 2555003


Ah OK, so PL and temps can certainly be ruled out for you! Does anyone know what the blue line means? On mine it is green (due to the skin?), and has a weird downward spike at 831 mV; I've always ignored that line... I'm getting within 0-15 MHz of my set clock with this, so I'm not sure the faint line is your problem.


----------



## Imprezzion

fray_bentos said:


> Ah OK, so PL and temps can certainly be ruled out for you! Does anyone know what the blue line means? On mine it is green (due to the skin?), and has a weird downward spike at 831 mV; I've always ignored that line... I'm getting within 0-15 MHz of my set clock with this, so I'm not sure the faint line is your problem.
> View attachment 2555005
> View attachment 2555005


I threw away all the old profiles and curves and re-did it from scratch. It looks a LOT better now. It is getting slightly hotter, as it's literally sitting within 2% of the power limit and I have my fans set very low (bottom of HWiNFO shows my front rad fans at 700RPM under gaming load; might've set the PWM a bit too lazy in BIOS). This is while playing The Division 2 @ 1080p DX11, HDR10 enabled, all maxed settings w/ ReShade + RTGI. About the heaviest load outside of Metro or Cyberpunk I can create lol. It is ever so slightly throttling now and then, dropping 1 bin, but effective stays pretty much the same. I'm pretty happy to see the card sustain >2000Mhz at 0.981. Based on my previous attempts this isn't a very good sample core-wise... VRAM is at +1200. It is good at +1400 but it does take slightly more power then and won't allow me to run 0.981.


----------



## Audioboxer

How good is Kombustor as a stability test? It's quite easy to leave looping. Having a go to see where 1.0v can take me. At 2070/1.0v we're pegged to 2055mhz. Power draw around 400w.

Also trying memory with +1100. Was 1000 before. Seems I have Hynix chips, my 2080Ti was Samsung. Given my knowledge of b-die I'd always have assumed Samsung chips were preferable but these Hynix appear decent!

Edit - The MSI curve was actually at 2055, sometimes this thing has a life of its own! Bumped it back up to 2070 at 1.0v and it's looping now.


----------



## Panchovix

Audioboxer said:


> View attachment 2555238
> 
> 
> How good is Kombustor as a stability test? It's quite easy to leave looping. Having a go to see where 1.0v can take me. At 2070/1.0v we're pegged to 2055mhz. Power draw around 400w.
> 
> Also trying memory with +1100. Was 1000 before. Seems I have Hynix chips, my 2080Ti was Samsung. Given my knowledge of b-die I'd always have assumed Samsung chips were preferable but these Hynix appear decent!
> 
> Edit - The MSI curve was actually at 2055, sometimes this thing has a life of its own! Bumped it back up to 2070 at 1.0v and it's looping now.


I think all 3080s have Micron; I would actually be impressed if there is GDDR6X that is not from Micron lol.

2070Mhz at 1V is pretty, pretty good IMO. I do like 2025Mhz at 1V; wish my chip was that good haha.

On mem you can probably go higher; I get more performance at +1600 or so, after that I start to get less performance, and at +1750-+1800Mhz I crash.


----------



## Audioboxer

Panchovix said:


> I think all 3080 have Micron, I would be actually impressed if there is GDDR6X that is not from Micron lol.
> 
> 2070Mhz at 1V is pretty, pretty good IMO, I do like 2025Mhz at 1V  wish my chip was good haha.
> 
> On mem you can probably go higher, I get more performance at +1600 or so, after that I start to get less performance, and at +1750-+1800Mhz I crash.


Entering noob territory here, but something I'm struggling with compared to my 2080Ti (which I just set at 1.093v and the max stable frequency): this card struggles when pushing high frequency and high voltage at 1.093~1.1v.

Then again, my 2080Ti was at 2100, which is no problem for me here. Trying to push nearer 2200, while it locks in OK in MSI Afterburner, if a high power load comes along like Kombustor or Metro it totally freaks out, dropping frequency like mad.

Whereas around 2160 seems to be quite stable in Metro in terms of keeping the frequency solid at a higher voltage.

Is this just the limitations of pushing the core clock irrespective of voltage selected? I have seen some people manage just over 2200 on the core and I wouldn't mind testing that at 1.1v, but it seems if any heavy load hits my card it buckles. Are those folks shunt modded or something?

On a 450w bios, so it says, so I don't think it's power limited. I mean, Metro is _just_ pulling around 380~400w.

I hate trying to work the afterburner curve at the best of times with it just randomly deciding to change itself, but I've repeated this behaviour across multiple apps and gaming.

For example if I take this into Metro, locked or unlocked, the card buckles completely and core clock jumps all over the place, even as low as 2025mhz.

But if I take this in

The worst it will drop down to is 2145mhz, and then lock itself there.

(I know memory is at default above, just figuring out core shenanigans right now).

(*edit* - quick pic with your memory +1600 suggestion lol, but as you can see will hold 2145mhz)

What is causing this? Shunt modding needed for higher than around 2160? And no, I don't really know what shunt modding is lol. Just aware of it from some surface level chat and videos on extreme GPU OCing.

Not something I'll be doing, just looking for an education on how a 450w bios combines with voltage and frequencies in terms of what limit comes first. Obviously my thermals are really low, so even I know it's not that.

My own tl;dr at the minute is: if I can enjoy 2160mhz at around 1.05v, there is no need to think about 1.1v. As in, just because my 2080Ti needed to be maxed out at 1.093v doesn't mean my 3080 is going to need to be maxed out at 1.1v.


----------



## Panchovix

I think you're still power limited (for some reason your card doesn't want to use more than 400W when it should); at 1.05V or more you will basically need 400W or more (depending on the game), which would explain why the 2nd curve works "better", since you're not in power-limit territory.

IMO you don't need shunt modding (or well, you shouldn't), but I sense maybe a sensor, or one of the 3x8 pins, or the PCI-E itself is limiting the power, since you should be able to use 450W. (Maybe a FTW3 VBIOS issue?)

I will download Kombustor and test, my card is air cooled so the clocks will be a bit lower, but it will help to show the power consumption.


----------



## Audioboxer

Panchovix said:


> I think you're still power limited (for some reason your card doesn't want to use more than 400W when it should); at 1.05V or more you will basically need 400W or more (depending on the game), which would explain why the 2nd curve works "better", since you're not in power-limit territory.
> 
> IMO you don't need shunt modding (or well, you shouldn't), but I sense maybe a sensor, or one of the 3x8 pins, or the PCI-E itself is limiting the power, since you should be able to use 450W. (Maybe a FTW3 VBIOS issue?)
> 
> I will download Kombustor and test, my card is air cooled so the clocks will be a bit lower, but it will help to show the power consumption.


Hmm, strange, something is up then. The highest I've seen GPU-Z report in terms of power draw is like 408w, so unless a reading is wrong (don't think so), it appears my card is struggling to actually draw up to 450w. These FTW3 cards have loads of posts online with people complaining about sensors/power load not balancing properly and so on, but EVGA rebuffed it all claiming there are no issues.

Kombustor tends to sit around 400w, so it's likely it is power-limited too and that's why it freaks out if I try to push high frequency/high voltage.

I guess I can try my power supply's default power cables; I am using Corsair braided cables rather than the pretty awful standard Corsair power cables that come with a HX1000i. No riser cable either, so it's not that. BIOS is set to GPU PCIe Gen 4 and resizable BAR is enabled.

*edit* - Yeah, Kombustor totally fails with that 2160mhz profile

Power draw limited, sitting at around 400w, frequency craters at 1.05v. I had it holding higher frequencies earlier, but that was around 1.0~1.025v.

Max power draw reports 409.3w, but this was clearly only a spike, sits at around 400w whilst running.

400w IIRC is like the max for this card as standard, so it's as if mine is ignoring the 450w BIOS and ignoring afterburner telling it to use up to 118%.

I guess the first thing to check would be PCIe cables?


----------



## Panchovix

Okay, here is a


Audioboxer said:


> Hmm, strange, something is up then. The highest I've seen GPU-Z report in terms of power draw is like 408w, so unless a reading is wrong (don't think so), it appears my card is struggling to actually draw up to 450w. These FTW3 cards have loads of posts online with people complaining about sensors/power load not balancing properly and so on, but EVGA rebuffed it all claiming there are no issues.
> 
> Kombustor tends to sit around 400w, so it's likely it is power-limited too and that's why it freaks out if I try to push high frequency/high voltage.
> 
> I guess I can try my power supply default power cables, I am using Corsair braided cables rather than the pretty awful standard Corsair power cables that comes with a HX1000i. No riser cable either, so it's not that. BIOS is set to GPU PCIe Gen 4 and resize bar is enabled.


I think you're correct, since now I managed to do the tests, and it seems to use, in my case 450-460W, and then I get power limited (by internal rails basically), here are some pics.
(I was using my 100% safe overclock, which is 2100Mhz at 1.093V and +1600MEM, but because temps went up, the clock settled for 2040-2055Mhz)



Spoiler: Kombustor on TUF 3080 Shunt modded

Here in imgur, since for some reason OC compresses the images a lot:


http://imgur.com/a/MJySbbt





So about 200W per 8-Pin and 70W on the PCI-E, if your card is 3x8 it should be like 135W per 8-pin and 40-50W from the PCI-E.
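The per-rail figures above can be sanity-checked with a trivial even-split model: measure the total board power and the PCIe-slot draw, then assume the VRM balances the remainder evenly across the 8-pin inputs. Real cards balance their input rails less neatly, so treat this only as a rough cross-check:

```python
# Back-of-envelope split of board power across input rails, following
# the numbers quoted above (even-split assumption across 8-pins).

def per_8pin_watts(total_w, slot_w, n_8pin):
    """Watts each 8-pin connector carries under an even-split assumption."""
    return (total_w - slot_w) / n_8pin

# Panchovix's shunt-modded 2x8-pin TUF: ~470 W total, ~70 W via the slot.
print(per_8pin_watts(470, 70, 2))   # -> 200.0 W per 8-pin
# His estimate for a 3x8-pin card at ~450 W with ~45 W via the slot.
print(per_8pin_watts(450, 45, 3))   # -> 135.0 W per 8-pin
```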

If you want I can also try Metro Exodus Enhanced Edition, I think I have it out there somewhere lol


----------



## Audioboxer

Panchovix said:


> Okay, here is a
> 
> I think you're correct, since now I managed to do the tests, and it seems to use, in my case 450-460W, and then I get power limited (by internal rails basically), here are some pics.
> (I was using my 100% safe overclock, which is 2100Mhz at 1.093V and +1600MEM, but because temps went up, the clock settled for 2040-2055Mhz)
> 
> 
> 
> Spoiler: Kombustor on TUF 3080 Shunt modded
> 
> 
> 
> 
> View attachment 2555281
> 
> View attachment 2555282
> 
> View attachment 2555280
> 
> Here in imgur, since for some reason OC compresses the images a lot:
> 
> 
> http://imgur.com/a/MJySbbt
> 
> 
> 
> 
> 
> So about 200W per 8-Pin and 70W on the PCI-E, if your card is 3x8 it should be like 135W per 8-pin and 40-50W from the PCI-E.
> 
> If you want I can also try Metro Exodus Enhanced Edition, I think I have it out there somewhere lol


Nah it's fine, I'm getting power limited. Now I need to find out why. Is it cables? Is the card dodgy? Is the BIOS dodgy?

I guess the "easiest" thing to test first are the PCIe cables.


----------



## Imprezzion

Audioboxer said:


> Nah it's fine, I'm getting power limited. Now I need to find out why. Is it cables? Is the card dodgy? Is the BIOS dodgy?
> 
> I guess the "easiest" thing to test first are the PCIe cables.


My guess is BIOS. The 2x8 pin cards also all claim 370w in the BIOS but none of them actually run that. 345w is the hard limit.


----------



## Audioboxer

Imprezzion said:


> My guess is BIOS. The 2x8 pin cards also all claim 370w in the BIOS but none of them actually run that. 345w is the hard limit.


EVGA directly supplied it on their forums as a 450w BIOS lol...

No idea what else I could try; I have one of these damn LHR 3080s, so knowing what I can flash is a nightmare.

It's got a dual BIOS, but I'm still paranoid about bricking cards by flashing BIOSes from other manufacturers.

Will go have a moan on the EVGA forums and see if a member of staff can look at the LHR BIOS provided.


----------



## Panchovix

Imprezzion said:


> My guess is BIOS. The 2x8 pin cards also all claim 370w in the BIOS but none of them actually run that. 345w is the hard limit.


Kinda agree, since some cards (like the TUF) have a "375W" limit, which is exactly the same theoretical max as 2x8-pin plus PCI-E (300W+75W), so it is kinda expected to use less than that; but with 3x8-pin, your theoretical max limit is 525W (450W from 3x8 + 75W PCI-E), so he isn't even near that at 400W, which could be caused by the VBIOS.
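The ceilings quoted here follow directly from the PCI-SIG connector ratings (75 W from the x16 slot, 150 W per 8-pin PCIe power connector):

```python
# The connector arithmetic above, using the PCI-SIG spec ratings.

SLOT_W = 75        # PCIe x16 slot
EIGHT_PIN_W = 150  # per 8-pin PCIe power connector

def theoretical_max_w(n_8pin):
    """Spec ceiling for a card with n_8pin auxiliary power connectors."""
    return SLOT_W + n_8pin * EIGHT_PIN_W

print(theoretical_max_w(2))   # 2x8-pin (e.g. TUF):  375
print(theoretical_max_w(3))   # 3x8-pin (e.g. FTW3): 525
```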


Audioboxer said:


> EVGA directly supplied it on their forums as a 450w BIOS lol...
> 
> No idea what else I could try, I have one of these damn LHR 3080's, so knowing what I can flash is a nightmare.
> 
> It's got a dual BIOS, but I'm still paranoid about bricking cards starting to flash BIOS from other manufacturers.


You can flash any VBIOS which has the same device ID, EXCEPT Founders Edition (this only applies for Ampere), subsystem can be anything.

I think for the 3080 LHR the device ID is "10DE 2216", the FHR is "10DE 2206", so you can filter with that.

And relax, with dual VBIOS you're pretty safe, and even so, if you have a spare GPU or are using a multi-GPU system (like in my case, for machine learning), you can just use the other card to recover a bricked one (assuming you're using nvflash only).
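The matching rule above can be written out as a tiny predicate. Treat it as forum lore rather than an official compatibility guarantee, and note the device IDs are the ones quoted in this post:

```python
# Panchovix's cross-flash rule: on Ampere, a VBIOS is flashable if its
# PCI device ID matches the card's (subsystem ID can differ), except
# Founders Edition images. Quoted IDs: 3080 LHR "10DE 2216",
# 3080 FHR "10DE 2206".

def can_flash(card_device_id, vbios_device_id, vbios_is_founders=False):
    if vbios_is_founders:
        return False
    return card_device_id.upper() == vbios_device_id.upper()

print(can_flash("10DE 2216", "10DE 2216"))   # LHR card, LHR vendor BIOS: True
print(can_flash("10DE 2216", "10DE 2206"))   # LHR card, FHR BIOS: False
```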


----------



## acoustic

Audioboxer said:


> EVGA directly supplied it on their forums as a 450w BIOS lol...
> 
> View attachment 2555284
> 
> 
> No idea what else I could try, I have one of these damn LHR 3080's, so knowing what I can flash is a nightmare.
> 
> It's got a dual BIOS, but I'm still paranoid about bricking cards starting to flash BIOS from other manufacturers.
> 
> Will go have a moan on the EVGA forums and see if a member of staff can look at the LHR BIOS provided.


You're assuming that software readings are 100% dead-on accurate


----------



## Panchovix

acoustic said:


> You're assuming that software readings are 100% dead-on accurate


Oh this can also be the cause, for example the card may be actually using 450W, but reporting only 400W.


----------



## Audioboxer

Panchovix said:


> Kinda agree, since some cards (like the TUF) have a "375W" limit, which is exactly the same theoretical max as 2x8-pin plus PCI-E (300W+75W), so it is kinda expected to use less than that; but with 3x8-pin, your theoretical max limit is 525W (450W from 3x8 + 75W PCI-E), so he isn't even near that at 400W, which could be caused by the VBIOS.
> 
> You can flash any VBIOS which has the same device ID, EXCEPT Founders Edition (this only applies for Ampere), subsystem can be anything.
> 
> I think for the 3080 LHR the device ID is "10DE 2216", the FHR is "10DE 2206", so you can filter with that.
> 
> And relax, with double VBIOS you're pretty safe, and even so, if you have a spare GPU or using a multi GPU system (like on my case for machine learning), you can just use the other card to recover a bricked one (assuming you're using nvflash only)


I'll have a look then.

I sent a support ticket to EVGA with my proof to ask for support and also check if there are any other BIOS files to try.

In the meantime I'll have a look on TechPowerUp and see if there are other LHR 3080 cards with a 450w+ BIOS. Who else makes a 3x8-pin card?



acoustic said:


> You're assuming that software readings are 100% dead-on accurate


Here is quite a clear example










Coming in just under 400w, it maintains 2100MHz at 1.0v.










Set to run at 2160MHz @ 1.05v, it pegs at 400w and the frequency craters to 2010MHz.

Given I've seen this










Which looks like a small spike to 409.3w, so it looks to me like the software is fine; there is something wrong with the card/BIOS. Other posters on the EVGA forums complain that the LHR BIOS doesn't go above 400w, though one person mentioned 408w.


----------



## Panchovix

I was testing some games, and the advantage of undervolting to consume less, or of a shunt mod to use the maximum amount of power, is pretty amazing. Here are some examples.
(This was done on PCIe 4.0 x8, so performance is lower than it should be, and thus power draw as well; at x16 4.0, add about 60W to each.)

Here in Control, in a static scene (all settings maxed, 1440p, no DLSS, RT enabled), at "stock" (which in reality is more than stock, since the card is using 370-380W instead of the max 350W) I was getting 61 FPS at 1.075V.










With an undervolt at [email protected] and +1350MHz on the memory, I get 63 FPS at 280-290W (so 3% more performance with 100W less).










With an undervolt at [email protected] and +1350 on the memory, 65 FPS at 360W (6.5% more performance than "stock" while still consuming less).










Finally, with the core locked at 2130MHz at 1.1V and +1500 on the memory, I get 67-68 FPS at the absurd amount of 420W (so 9.8% to 11.4% more performance than "stock").










I still haven't bought a block, haha; it seems too hard to install on a PC without any pump or anything, and there is no AIO for the TUF 3080, so I guess I'll just stick with the stock cooler.
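The trade-off in those four runs is easier to see as perf-per-watt. A quick sketch using the FPS and wattage figures above (midpoints where a range was given; the labels are mine):

```python
# Perf-per-watt for the Control runs above: (FPS, watts).
runs = {
    "stock 1.075V":            (61.0, 375),
    "undervolt #1, +1350 mem": (63.0, 285),
    "undervolt #2, +1350 mem": (65.0, 360),
    "2130MHz 1.1V, +1500 mem": (67.5, 420),
}

base_fps = runs["stock 1.075V"][0]
for name, (fps, watts) in runs.items():
    uplift = (fps / base_fps - 1) * 100
    print(f"{name:26s} {fps:5.1f} FPS  {watts}W  "
          f"{fps / watts:.3f} FPS/W  {uplift:+.1f}% vs stock")
```

The first undervolt comes out far ahead on efficiency (~0.221 FPS/W vs ~0.163 at stock), which is the same conclusion fray_bentos draws further down the page.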


----------



## Panchovix

Audioboxer said:


> In the meantime I will have a look on techpower and see if there are other LHR 3080 cards with 450w+ BIOS. Who else makes a 3 pin card?


The ones that come to mind right now:

ASUS 3080 Strix
MSI 3080 SUPRIM X
Palit 3080 Gamerock
Colorful 3080 Vulkan X
Gigabyte 3080 Aorus Xtreme


----------



## Audioboxer

Panchovix said:


> The ones that comes from my mind now:
> 
> ASUS 3080 Strix
> MSI 3080 SUPRIM X
> Palit 3080 Gamerock
> Colorful 3080 Vulkan X
> Gigabyte 3080 Aorus Xtreme


Thanks, I'll have a look. I'm really hoping it's not an EVGA card fault/some sort of dumb LHR restriction on EVGA cards.


----------



## Audioboxer

Panchovix said:


> The ones that comes from my mind now:
> 
> ASUS 3080 Strix
> MSI 3080 SUPRIM X
> Palit 3080 Gamerock
> Colorful 3080 Vulkan X
> Gigabyte 3080 Aorus Xtreme











Asus RTX 3080 VBIOS


10 GB GDDR6X, 1440 MHz GPU, 1188 MHz Memory




www.techpowerup.com





Should that be OK to try? It has the LHR device ID and 3x8-pin.


----------



## Panchovix

Audioboxer said:


> Asus RTX 3080 VBIOS
> 
> 
> 10 GB GDDR6X, 1440 MHz GPU, 1188 MHz Memory
> 
> 
> 
> 
> www.techpowerup.com
> 
> 
> 
> 
> 
> That should be OK to try? LHR device ID and it has 3 pin.


It should, but some monitor outputs may not work (you basically have to try); at least 2 DisplayPort and 1 HDMI will work 100%.


----------



## Audioboxer

Panchovix said:


> It should, but maybe some monitor outputs may not work (you have to try basically), but at least 2DP and 1 HDMI will work 100%


It worked fine in terms of loading, but it struggled to go over 330w lmao.

Quite clearly an EVGA BIOS is needed, and I'm going to have to chase CS about why they haven't fixed their power-balancing issues.

*edit* - Oooft, RIP, seems all EVGA 3xxx cards are designed poorly https://forums.evga.com/EVGA-is-aware-of-the-power-balancing-issues-and-430w-cap-m3227786.aspx & https://forums.evga.com/Power-Load-Balance-3080-FTW3-m3304176.aspx

The 3rd 8-pin is basically a scam. So it's a 400w card, end of story.


----------



## fray_bentos

Audioboxer said:


> It worked fine in terms of loading, but it struggled to go over 330w lmao.
> 
> Quite clearly an EVGA BIOS is needed and I'm going to have to chase CS on why they haven't fixed their power balancing issues.
> 
> *edit* - Oooft, RIP, seems all EVGA 3xxx cards are designed poorly https://forums.evga.com/EVGA-is-aware-of-the-power-balancing-issues-and-430w-cap-m3227786.aspx & https://forums.evga.com/Power-Load-Balance-3080-FTW3-m3304176.aspx
> 
> The 3rd pin is basically a scam. So it's basically a 400w card, end of story.


Kind of a con, but the real "con" is that pumping the voltage/power on the 3000 series is just not worth it, as @Panchovix's post further up this page illustrates; the 881 mV example is the clear sweet spot. Set a voltage of 825-900 mV, find the max clocks for that voltage, enjoy the reasonable power consumption, quiet noise, high frames, and move on.


----------



## Audioboxer

fray_bentos said:


> Kind of a con, but the real "con" is that pumping the voltage/power on the 3000 series is just not worth it, as @Panchovix's post further up this page illustrates; the 881 mV example is the clear sweet spot. Set a voltage of 825-900 mV, find the max clocks for that voltage, enjoy the reasonable power consumption, quiet noise, high frames, and move on.


I'm watercooled, I don't have any noise lol.

I could quite easily pull off 1.1v/450w if the EVGA cards weren't what they are. Seemingly they were designed as 2x8-pin, then EVGA decided at the last minute to go 3x8-pin but did some cheap 6-pin-to-8-pin nonsense for the 3rd connector. So it's somewhat of a fake 3x8-pin. It can do 400w, unlike the 380w max you'd often see on 2x8-pin, but it can't balance itself properly to consistently go above 400w.

Jacob was supposed to comment on it in 2021, but there was no follow-up. So basically, sweep it under the rug: did you hear about our Ti versions with good power balancing?

The Ti was a step up; otherwise I'd recommend everyone stay clear of EVGA 3080/3090s. Only the Ti models seem to have been fixed.


----------



## Panchovix

Audioboxer said:


> It worked fine in terms of loading, but it struggled to go over 330w lmao.
> 
> Quite clearly an EVGA BIOS is needed and I'm going to have to chase CS on why they haven't fixed their power balancing issues.
> 
> *edit* - Oooft, RIP, seems all EVGA 3xxx cards are designed poorly https://forums.evga.com/EVGA-is-aware-of-the-power-balancing-issues-and-430w-cap-m3227786.aspx & https://forums.evga.com/Power-Load-Balance-3080-FTW3-m3304176.aspx
> 
> The 3rd pin is basically a scam. So it's basically a 400w card, end of story.


If it doesn't work on other VBIOSes either, then yeah, it's probably a hardware issue. I don't know what EVGA uses to regulate power and voltage, but it seems to be different from all the other Ampere cards for some reason.

Man, it's a bummer, because the card is on water; 400W is basically nothing for that. If I were on water I would just use an overclock preset at 450-470W lol.

But that's really shady of EVGA. Aren't there newer VBIOSes for the EVGA 3080 FTW3 LHR? You could also keep trying the other VBIOSes to check whether one works better than the others.


----------



## yzonker

Audioboxer said:


> I'm watercooled, I don't have any noise lol.
> 
> I could quite easily pull off 1.1v/450w if the EVGA cards weren't what they are. Seemingly they were designed as 2x8 pin, then EVGA decided at the last minute to go 3 pin but did some cheap 6 pin to 8 pin nonsense for the 3rd connector. So it's somewhat of a fake 3 pin. It can do 400w, unlike the 380w max you'd often see on 2 pin, but it can't balance itself properly to consistently do above 400w.
> 
> Jacob was supposed to comment on it in 2021, but there was no follow up. So basically, sweep under the rug, did you hear about our Ti versions of our cards with good power balancing?
> 
> This was a step up, otherwise I'd recommend everyone stays clear of EVGA 3080/3090s. Only the Ti models seem to have been fixed.


Pretty much all 3x8-pin 3080 Tis suffer from that too, not just EVGA. The 3080 Ti FTW3 is kinda saved by the fact that it pulls about 30w more than reported (confirmed by myself with a clamp meter, along with one or two others). The 2nd 8-pin runs even hotter than what it reports. So from that standpoint it does hit 450w, but you'll see 400-420w in GPU-Z/HWiNFO most of the time gaming. You also have the Galax 1kw BIOS if you want to daily an XOC BIOS.


----------



## yzonker

Panchovix said:


> If it on other VBIOS it doesn't work, then yeah, it is a hardware issue probably, I don't know what does EVGA use to regulate power and voltage, but it seems to be different than all the other Ampere cards for some reason.
> 
> Man, it's a bummer, because the card is on water, 400W is basically nothing for that; if I was on water I would just use a overclock preset at 450-470W lol.
> 
> But that's really shady of EVGA, there aren't newer VBIOS for the 3080 EVGA FTW3 LHR? You can also keep trying the other VBIOS to check if one does works better than the others.


Question is, was it really not pulling more than 330w, or are the readings just borked because the bios came from a card with a different controller? (or are the readings borked and causing it to not pull more than 330w?)


----------



## Panchovix

yzonker said:


> Question is, was it really not pulling more than 330w, or are the readings just borked because the bios came from a card with a different controller? (or are the readings borked and causing it to not pull more than 330w?)


It may be the case, since it is a different controller; @Audioboxer would basically have to test whether he gets the same clocks at these "330W", basically as if it were a shunt mod.

And wow, isn't it kind of dangerous that one of the three cables gets hotter than the others? How would you manage to fix that? (I guess only EVGA can do it.)


----------



## Audioboxer

yzonker said:


> Pretty much all 3x8pin 3080Ti's suffer from that too, not just EVGA. The 3080ti FTW3 is kinda saved by the fact that they pull about 30w more than reported (confirmed by myself with clamp meter along with one or 2 others). 2nd 8pin is even hotter than what it reports. So from that standpoint it does hit 450w, but you'll see 400-420w in GPUZ/HWinfo most of the time gaming. You also have the Galax 1kw bios if you want to daily a XOC bios.












But here is an example of an EVGA card working properly. Their forums are absolutely plagued with commentary on EVGA using crappy components. The fact that one card can do it and others can't means there is clearly something wrong in the design.

Most people with issues see their Pin 3 limited to around 75w. As you can see in the screenshot above, 120w is achieved.


----------



## yzonker

Audioboxer said:


> View attachment 2555326
> 
> 
> But here is an example of an EVGA card working properly. Their forums are absolutely plagued with commentary on EVGA using crappy components. The fact one card can do it and others can't means there is clearly something wrong in the design.
> 
> Most people with issues see their Pin 3 limited to around 75w. As you can see in that screenshot above 120w is achieved.


So that's your card being driven to 450w by Furmark? That works on the 3080 Tis too, if that's what you're showing. The 3080 Tis are hitting an internal rail limit, but not the 8-pins or PCIe; something else, maybe the memory rail.

Might try this Gigabyte one, as I think they use the same controller (the 3090 does, anyway):









Gigabyte RTX 3080 VBIOS


10 GB GDDR6X, 1440 MHz GPU, 1188 MHz Memory




www.techpowerup.com





Or MSI,









MSI RTX 3080 VBIOS


10 GB GDDR6X, 1440 MHz GPU, 1188 MHz Memory




www.techpowerup.com





Or Palit, 









Palit RTX 3080 VBIOS


10 GB GDDR6X, 1440 MHz GPU, 1188 MHz Memory




www.techpowerup.com





Those might have a better chance if the controller is the same as the 3090's. (The first post in this thread doesn't show the controller for those brands.)


----------



## yzonker

Panchovix said:


> It may the case, since it is a different controller, @Audioboxer would have to basically test if he gets the same clocks at these "330W", basically if it was like a shunt mod.
> 
> And wow, it isn't kinda dangerous that 1 of the 3 cables gets more hotter than the other one? How you would manage to fix that? (I guess only EVGA can do it)


Nah, the highest I saw on the 2nd 8-pin while running Kombustor was 180-190w. That won't hurt anything. I have seen 250w on that 2nd 8-pin running the Galax 1kw BIOS, though.


----------



## yzonker

One more post and I'll quit spamming... 

Doesn't look like you would be limited by either the 8-pins (175w) or PCIe (78w). It would have to be something else.


----------



## Audioboxer

Here's something interesting... I was given some advice elsewhere that my card might be underreporting. Someone picked up that I have an HX1000i and told me to check the 12v power reading in iCUE.










At default it sits around 80w, with tiny fluctuations.










Kombustor profile 1, software reads 387w, 12v power in iCUE reads 510w.

510w - 80w = 430w.










Kombustor profile 2, software is pegged at 400w, 12v power in iCUE reads 516~522w.

522w - 80w = 442w.

Now, this likely isn't an "exact science", doing what I'm doing with software readings from the power supply. I don't think anything else runs off the 12v either? As in, would anything else add to that wattage under a Kombustor load? Does the CPU connector run off the 12v rail? Kombustor puts a mild load on the CPU (not too much).

Edit - the CPU does indeed run off the 12v rail, so I need to factor it in. Which means my 400w ceiling is likely being reported correctly.
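The arithmetic above can be wrapped in a small sketch. `cpu_load_delta_w` is my own hypothetical term for whatever extra the CPU pulls from the 12V rail under the Kombustor load; the 40w figure below is an assumed value, not something measured in the post:

```python
# Estimate GPU draw from PSU 12V-rail telemetry: the load reading minus the
# idle baseline, minus any extra CPU draw (the CPU also runs off 12V).
def estimate_gpu_draw(rail_load_w, rail_idle_w, cpu_load_delta_w=0):
    return rail_load_w - rail_idle_w - cpu_load_delta_w

# Kombustor profile 1: 510w under load vs ~80w idle
print(estimate_gpu_draw(510, 80))       # → 430
# Profile 2: 522w, assuming the CPU adds ~40w under Kombustor's mild load
print(estimate_gpu_draw(522, 80, 40))   # → 402
```

With even a modest CPU share subtracted, the estimate lands close to the 400w software reading, which matches the conclusion in the edit above.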



yzonker said:


> So that's your card being driven to 450w by Furmark? That works on the 3080ti's too if that's what you are showing. For the 3080ti's, they are hitting an internal rail limit, but not the 8pins or pcie. Something else, maybe the memory rail.
> 
> Might try this Gigabyte as I think they use the same controller (3090 does anyway),
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Gigabyte RTX 3080 VBIOS
> 
> 
> 10 GB GDDR6X, 1440 MHz GPU, 1188 MHz Memory
> 
> 
> 
> 
> www.techpowerup.com
> 
> 
> 
> 
> 
> Or MSI,
> 
> 
> 
> 
> 
> 
> 
> 
> 
> MSI RTX 3080 VBIOS
> 
> 
> 10 GB GDDR6X, 1440 MHz GPU, 1188 MHz Memory
> 
> 
> 
> 
> www.techpowerup.com
> 
> 
> 
> 
> 
> Or Palit,
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Palit RTX 3080 VBIOS
> 
> 
> 10 GB GDDR6X, 1440 MHz GPU, 1188 MHz Memory
> 
> 
> 
> 
> www.techpowerup.com
> 
> 
> 
> 
> 
> Those might have a better chance if the controller is the same as the 3090. (the first post in this thread doesn't show the controller for those brands)


That picture/card above isn't mine; someone else supplied it to me as an example of a 3080 FTW3 correctly drawing 450w.


----------



## Astral85

Has anyone tried a non-Asus VBIOS on the ROG Strix 3080? My Strix 3080 refused the EVGA XOC BIOS.


----------



## Audioboxer

Astral85 said:


> Has anyone tried any non Asus VBIOS on the ROG Strix 3080? My Strix 3080 refused the EVGA XOC BIOS.


Flashing with -6 in nvflash, yeah?


----------



## Audioboxer

Ok, here's the clearest example from me that the EVGA 450w BIOS does absolutely nothing and/or these cards are broken and should be avoided.

400w standard BIOS










450w OC BIOS










There is absolutely no difference; pegged at 400w for both. Either the hardware is faulty and EVGA have lied about the 3x8-pin power configuration, or their BIOS is faulty. Given nothing about this BIOS has been updated since last year, it looks like a hardware fault/power-delivery lie that was swept under the rug on EVGA cards.


----------



## Astral85

Audioboxer said:


> Flashing with -6 in nvflash, yeah?


Nope, the EVGA XOC BIOS is a .exe.


----------



## Astral85

Would anybody have any idea why some games won't use more GPU power? I have two or three games where the GPU won't draw any more than 330W (or so). These are games with ray tracing enabled and mostly maxed-out graphics settings. GPU usage in most cases is 80-99%, but the card won't go anywhere near the power limit. This seems like a lot of available card performance not being used... The only game I've seen pull nearly all of my Strix 3080's 450W is Control.


----------



## Audioboxer

Astral85 said:


> Nope, the EVGA XOC BIOS is a .exe.


That won't work, as it's made exclusively for EVGA cards when it's packaged as an exe like that.

You have to find a BIOS on TechPowerUp and flash it manually with nvflash64 and the -6 flag.


----------



## Astral85

Audioboxer said:


> Ok, clearest example from me the EVGA 450w BIOS does absolutely nothing and/or these cards are broken and should be avoided
> 
> 400w standard BIOS
> 
> View attachment 2555414
> 
> 
> 450w OC BIOS
> 
> View attachment 2555415
> 
> 
> There is absolutely no difference, pegged at 400w for both. Either the hardware is faulty and EVGA have lied about a 3 pin power configuration, or their BIOS is faulty. Given nothing has been updated with this BIOS release since last year, looks like a sweep under the rug hardware fault/power delivery lie on EVGA cards.


Try some of the 3DMark test suite, Time Spy, Fire Strike, etc. See if they will get you up to 450W.


----------



## Audioboxer

Astral85 said:


> Try some of the 3D Mark test suite, Timespy, Firestrike etc. See if they will get you up to 450W.


Nah, 3DMark falls even further; things like FurMark/Kombustor really push the power draw. Time Spy can sometimes be decent/get you some spikes, but if anything should max out power draw almost instantly, it's something like Kombustor.

The card just won't draw more than around 400w.


----------



## acoustic

RMA or return the card. You're likely being limited by pin #2 being at 148w, which means the card is capping your power there. In reality your #2 pin is probably pulling 160w+, but the card isn't able to recognize that #2 is maxed and ask for more power from #1, #3, or the PCIe slot; it simply says "I'm maxed on pin #2, so we are at our limit for board power draw" instead.

No BIOS or software is going to fix that issue, unfortunately.


----------



## Audioboxer

acoustic said:


> RMA or return the card. You're likely being limited by the #2 Pin being at 148w, which means the card is capping your power there. In reality your #2 pin is probably pulling 160w+, but the card isn't able to dictate that #2 pin is maxed and ask for more power from #1 or #3 or PCI slot - it simply says "I'm maxed on pin#2, so we are at our limit for board power draw" instead.
> 
> No BIOS or software is going to fix that issue, unfortunately.


EVGA are apparently refusing RMAs for any complaints about power draw, just citing that the card runs within spec. Waiting to hear what their explanation is for the 450w BIOS not working.

I'm somewhat reluctant to RMA anyway; you end up with an EVGA refurb card, and after spending a few days reading about this, it seems lots of people face this issue. Given my card was recently manufactured, I don't think I'd even trust an EVGA RMA not to simply return another card that does the same. All EVGA RMAs in Europe need to go to Germany now, so it costs me nearly £30 to ship the card to them. That could be £30 flushed if a refurb does the same.

This card seems pretty decent in terms of OCing as well, from running at lower voltages to being happy at 2170 / 1.05 and +1600 on memory with no crashing.

Though the memory side of things might be helped by cool temps; I do see most people on the EVGA forums getting around +1000. It was a poster in this topic who suggested I try 1600. No issues with it yet, and I don't believe I'm seeing any performance loss (from memory error correction).


----------



## acoustic

Audioboxer said:


> EVGA are apparently refusing RMAs for any complaints about power draw just citing the card runs within spec. Waiting to hear what their explanation is for the 450w BIOS not working.
> 
> Somewhat reluctant to RMA anyway, you end up with an EVGA refurb card and after spending a few days reading about this now it seems lots of people face this issue. Given my card is recently manufactured I don't think I'd even trust an EVGA RMA not to simply return another card that does the same. All EVGA RMAs in Europe need to go to Germany now, so that costs me nearly £30 to ship the card to them. Could be £30 flushed if a refurb does the same.
> 
> This card seems pretty decent in terms of OCing as well, from lower voltages to being happy at 2170 / 1.05 and +1600 on memory with no crashing.
> 
> Though the memory side of things might be helped by cool temps. I do see most people on the EVGA forums getting around +1000. It was a poster in this topic who suggested I try 1600. No issues with it yet and I don't believe I'm seeing any performance loss (memory correction).


You can verify performance loss due to memory OC by running the Heaven benchmark in a window, pausing the camera (so it's a still screen, but still actively rendering) and adjusting the memory clock. Eventually you'll hit a point where the FPS stops going up; it usually rides that out for a bit, and then starts decreasing. My 3080 Ti STRIX can do +1600 for benching, but once the card warms up, I'm around +1500. I run +1000 for 24/7 use, as the power cost of the added memory OC does not seem to be worth more than the core OC.

2170 @ 1.05V is incredibly good if that's actually stable. I'd suspect it's not in more stressful workloads, though. I run 0.937V @ 2010 for 24/7 use. I could obviously go higher, since I'm on water and with a 450-watt BIOS, but truthfully dumping 450 watts into the loop 24/7 (at 4K, I'm usually pushing 400w+ even at 0.937V) just makes for a lot of heat without much performance gain.

As for RMAing and getting a refurb.. yep. I wouldn't stress the power issue much; if you really want to go down the rabbit hole, you could just shunt-mod the 3rd PCIe connector for a small ~50-100 watt gain to trick the card into balancing its power better.

I had a 3080 FTW3 before I handed that down and grabbed the 3080TI STRIX. My 3080FTW3 had zero power balancing issues; if anything, it would suck down 475-480watts in short bursts. The card died under normal usage, though. One day it just showed red lights when powering the PC up, and said goodnight.. lol


----------



## Audioboxer

acoustic said:


> You can verify performance loss due to memory OC by running Heaven Benchmark in a window, pausing the camera (so it's a still screen, but still actively rendering) and adjusting memory clock. Eventually you'll hit a point where the FPS stops going up, and then usually ride that out for a bit, and then it'll start decreasing. My 3080TI STRIX can do +1600 for benching, but once the card warms up, I'm around +1500. I run +1000 for 24/7 use, as the power consumption for added memory OC does not seem to outweigh the core OC.
> 
> 2170 @ 1.05mv is incredibly good if that's actually stable. I'd suspect it's not in more stressful workloads, though. I run .937mv @ 2010 for 24/7 use. I can go higher obviously since I'm on water and with a 450watt BIOS, but truthfully dumping 450watts into the loop 24/7 (at 4K, I'm usually pushing 400w+ even at .937mv) just makes for a lot of heat without much performance gain.
> 
> As for RMAing and getting a refurb.. yep. I wouldn't stress the power issue much - if you really want to go down the rabbit hole, you could just shunt-mod the 3rd PCIE connector for a small ~50-100watt gain to trick the card into balancing it's power better.
> 
> I had a 3080 FTW3 before I handed that down and grabbed the 3080TI STRIX. My 3080FTW3 had zero power balancing issues; if anything, it would suck down 475-480watts in short bursts. The card died under normal usage, though. One day it just showed red lights when powering the PC up, and said goodnight.. lol


Graphics score is showing 19,547 with +1600 (I scored 18,974 overall in Time Spy).

I've played a few hours of Metro and it's been running a fairly lengthy amount of Kombustor, all in. Usually Metro Exodus with everything cranked to max/RT can crash things within minutes, if not 20-30 minutes.

But I think you're probably right about how the core should be fed more power in my case by reducing the memory OC a little:

19,578 with +1200 (I scored 18,984 overall in Time Spy).

My biggest issue is the power draw. Everything is bouncing off 400w, which results in the core often running at 2160, more likely 2130-2145 under any heavy load. It just can't sustain itself, due to the card wanting to go past 400w.

Believe it or not, actually having 450w would really help out. I've never shunt-modded anything before, so at this point I'd probably refrain from doing it unless it's noob-proof. The last thing I need is a dead card lol.


----------



## acoustic

Audioboxer said:


> Graphics score is showing 19547 with 1600 I scored 18 974 in Time Spy
> 
> I've played a few hours of Metro and its been running a fairly lengthy amount of Kombustor all in. Usually Metro Exodus with everything cranked to max/RT can crash things within minutes, if not 20~30 minutes.
> 
> But I think you're probably right on how the core should be fed more power in my instance by reducing the memory OC a little
> 
> 19578 with 1200 I scored 18 984 in Time Spy
> 
> My biggest issue is the power draw. Everything is bouncing off 400w which results in the core often running at 2160, more likely 2130~2145 under any heavy load.
> 
> Believe it or not actually having 450w would really help out. I've never shunt modded anything before so at this point I'd probably refrain from doing it unless its noob-proof. Last thing I need is a dead card lol.


I'm surprised that it's running Metro Exodus, but considering how much power draw that game has, I can't imagine you're running anywhere near 1.05V. I see dips down to 0.887V in Metro Exodus: EE with RT maxed. That game is ****ing brutal. You might see better numbers (even in benching) by manually editing your V/F curve and finding the max clock at each voltage point from 0.900V up to what you want to daily at 1.05V. You'll likely break 20k in Time Spy by doing this. It's time consuming, but the easiest way is to lock the card to a voltage and then find the stable point. Very, very crucial for benching on a card that can't just use as much power as it wants, like a KINGPIN or one running a 1000w BIOS.

450w vs 400w would help for benching, sure.. for gaming? You're talking about 1-3% in the worst/best-case scenario. The card drops off the efficiency/perf-per-watt curve very quickly in the mid-300w range.


----------



## Audioboxer

acoustic said:


> I'm surprised that it's running Metro Exodus, but considering how much power-draw it has, I can't imagine you're running anywhere near 1.05mv. I see dips down to .887mv in Metro Exodus: EE with RTX max. That game is ****ing brutal. You might see better numbers (even in benching) by manually editing your V/F curve and finding the max on your voltage points from .900mv to what you want to daily at 1.05mv. You'll likely break 20k in TS by doing this. It's time consuming, but easiest way is to lock the card to that voltage and then find the stable point. Very, very crucial for benching on a card that can't just use as much power as it wants like a KINGPIN or running a 1000w BIOS.
> 
> 450w vs 400w would help for benching, sure .. for gaming? You're talking about 1-3% worst/best-case scenario. The card very quickly drops off the efficiency/perf-per-watt scale in the mid 300w range.












Metro just holds it; some scenes might dip to 2145 if closer to 400w. Yes, the old take-a-picture-of-your-screen, lol. I recently got a GMMK Pro and haven't assigned a print screen key yet. The Snipping Tool is fine for the desktop, but games need that dedicated print screen key!

I'm running 3440x1440 with DLSS, so it's not 4K native or anything, which likely wouldn't reach such core clocks.

Yeah, you're likely right about games, so I think I'll just be happy with what I've got and wait and see what EVGA say to my ticket anyway.


----------



## acoustic

I'm surprised to see such low wattage even at 3440x1440. I was running 3840x1600 for the longest time, and that was impossible to keep under 400 watts even with DLSS enabled. Regardless.. for benching, you should check out scaling your lower voltage points, or find the lowest voltage point it hits at any time during the test and start increasing from there. You'll find another couple hundred points without a doubt.

I was also looking at a new keyboard.. the GMMK Pro looked nice, but I like a keypad. I spilled coffee in my K95 Platinum and thought I did a damn good job getting it all up (it was just a small splash..) .. well, 24hrs later .. chunky, chunky clicking on the shift, r-ctrl, alt, and r-win keys.. lol. Grabbed a SteelSeries Apex Pro to replace it; I really like the variable switches, and the wrist rest feels really nice.


----------



## Audioboxer

acoustic said:


> I'm surprised to see such low wattage even at 3440x1440. I was running 3840x1600 for the longest and that was impossible to keep under 400watt even with DLSS enabled. Regardless.. for benching, you should check out scaling your lower voltage points, or find the lowest voltage point it hits at any point during the test, and start increasing them from there. You'll find another couple hundred points without a doubt.
> 
> I was also looking at a new keyboard.. the GMMK Pro looked nice but I like a keypad. I spilled coffee in my K95 Platinum, thought I did a damn good job getting it all up (was just a small splash..) .. well 24hrs later .. chunky chunky clicking on the shift, r-ctrl, alt, and r-win key.. lol. Grabbed a Steelseries APEX Pro to replace it - really like this variable switches, and the wrist-rest feels really nice.


Yeah, it can still bump up higher than that in some scenes, and then you see the core drop a bit.

I fell in love with Boba U4 switches, so a mechanical keyboard with swappable switches is what does it for me lol. Silent tactiles. I hate noisy keyboards! Quite a few mods went into the keyboard; it's not quite an out-of-the-box experience. So I still understand why the much larger market is prebuilts. The Apex Pro is nice, and would definitely be up there as my prebuilt choice.


----------



## acoustic

I briefly looked at a custom keyboard and .. I have too many expensive hobbies already. Between custom WC and the crazy cost of hardware, the last thing I need to do is add another rabbit hole LOL


----------



## Audioboxer

acoustic said:


> I briefly looked at a custom keyboard and .. I have too many expensive hobbies already. Between custom WC and the crazy cost of hardware, the last thing I need to do is add another rabbit hole LOL


Custom keyboards can get absolutely ridiculous. I'd say the GMMK Pro is priced at the upper end of the budget market, more likely entry mid-tier, because the switches and your keycaps of choice add a bit, and then there are any mods with stabilisers/lubing/etc.

Whereas something like the Apex Pro, all in, will be cheaper at retail, and you don't need to mod anything.

I wouldn't say this will ever turn into a hobby for me; it's more a one-and-done. I have the keyboard I like, with a form factor I can work with, and that's it. No more keyboards, keycaps, switches or anything lol. I do have another keyboard I sometimes use for number work, but most work I do on the main PC is typing, so a TKL style was always fine for me. No real need for a dedicated numpad.

Speaking of which, you can even buy quite expensive custom mini numpads and the like. The custom keyboard world is one heck of a rabbit hole to go down! lol


----------



## Audioboxer

acoustic said:


> You can verify performance loss due to memory OC by running Heaven Benchmark in a window, pausing the camera (so it's a still screen, but still actively rendering) and adjusting memory clock. Eventually you'll hit a point where the FPS stops going up, and then usually ride that out for a bit, and then it'll start decreasing. My 3080TI STRIX can do +1600 for benching, but once the card warms up, I'm around +1500. I run +1000 for 24/7 use, as the power consumption for added memory OC does not seem to outweigh the core OC.
> 
> 2170 @ 1.05mv is incredibly good if that's actually stable. I'd suspect it's not in more stressful workloads, though. I run .937mv @ 2010 for 24/7 use. I can go higher obviously since I'm on water and with a 450watt BIOS, but truthfully dumping 450watts into the loop 24/7 (at 4K, I'm usually pushing 400w+ even at .937mv) just makes for a lot of heat without much performance gain.
> 
> As for RMAing and getting a refurb.. yep. I wouldn't stress the power issue much - if you really want to go down the rabbit hole, you could just shunt-mod the 3rd PCIE connector for a small ~50-100watt gain to trick the card into balancing it's power better.
> 
> I had a 3080 FTW3 before I handed that down and grabbed the 3080TI STRIX. My 3080FTW3 had zero power balancing issues; if anything, it would suck down 475-480watts in short bursts. The card died under normal usage, though. One day it just showed red lights when powering the PC up, and said goodnight.. lol
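For anyone wanting to automate that windowed-Heaven sweep, the pause-and-step method quoted above boils down to: raise the memory offset, log FPS, and stop once FPS dips (error correction eating the gains). A rough sketch only; the function name, step sizes, and sample numbers below are made up for illustration:

```python
def find_mem_plateau(samples):
    """samples: list of (mem_offset, avg_fps) pairs from the windowed
    Heaven test, in increasing offset order. Returns the offset just
    before FPS starts declining, i.e. where GDDR6X error correction
    begins eating the gains."""
    best = samples[0]
    for offset, fps in samples[1:]:
        if fps < best[1]:       # FPS dropped: past the sweet spot
            return best[0]
        best = (offset, fps)
    return best[0]              # never declined within the tested range

# Hypothetical run: FPS climbs, flattens, then dips past +1500
run = [(500, 140.1), (1000, 141.0), (1500, 141.4), (1600, 141.3)]
print(find_mem_plateau(run))    # → 1500
```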

Heaven be like, +2000 on memory? I don't even care 

Letting it run at 2000 just for a laugh, see if it can finally crash.


----------



## Panchovix

Audioboxer said:


> My biggest issue is the power draw. Everything is bouncing off 400w which results in the core often running at 2160, more likely 2130~2145 under any heavy load. Just can't sustain itself due to the card wanting to go past 400w.


That kinda does affect 3DMark scores, since it is downclocking; on my TUF I use like 490W on Time Spy, especially on the 2nd test lol

I scored 18 150 in Time Spy (20001 graphics score)

Waiting for colder days; it's Autumn now in Chile, but ambient temps are still 20°C or so. In Winter my best shot will be at 5°C ambient haha.

Also I'm impressed, running my 3080 on PCI-E 4.0 X16, on Control with RTX maxed AND DLSS, I still use like 420W lol.




Audioboxer said:


> View attachment 2555439
> 
> 
> View attachment 2555440
> 
> 
> Heaven be like, +2000 on memory? I don't even care
> 
> Letting it run at 2000 just for a laugh, see if it can finally crash.


Pretty nice mem you got there, 0.7% bump is 0.7% bump for benchmarks.
Mine dies at +1650-+1700 or so, plain crashes.

If you want to try higher, you can use nvidia-smi with the command -lmc 11600, for example (that would be the same as a +2050 offset)
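For reference, the mapping between an Afterburner-style offset and the absolute clock that nvidia-smi's -lmc (--lock-memory-clocks) expects is roughly the stock data-rate clock plus the offset; a 3080's GDDR6X reports its spec clock around 9501 MHz in nvidia-smi terms (19 Gbps effective). A sketch of that arithmetic, assuming that 9501 MHz baseline:

```python
def lmc_target(offset_mhz, stock_mhz=9501):
    """Convert an MSI Afterburner memory offset (MHz) to the absolute
    memory clock that nvidia-smi --lock-memory-clocks wants."""
    return stock_mhz + offset_mhz

# A +2050 offset lands near the 11600 MHz figure used with -lmc
print(lmc_target(2050))  # → 11551
```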


----------



## Audioboxer

Panchovix said:


> That kinda does affect 3DMark scores, since it is downclocking; on my TUF I use like 490W on TimeSpy, specially on the 2nd test lol
> 
> I scored 18 150 in Time Spy (20001 graphics score)
> 
> Waiting colder days, on Autumn now in Chile, but it's still ambient temps of 20°c or so; on Winter my best shot will try on 5°C ambient haha.
> 
> Also I'm impressed, running my 3080 on PCI-E 4.0 X16, on Control with RTX maxed AND DLSS, I still use like 420W lol.
> 
> 
> 
> Pretty nice mem you got there, 0.7% bump is 0.7% bump for benchmarks.
> Mine dies at +1650-+1700 or so, plain crashes.
> 
> If you want try higher, you can try nvidia-smi, with the command -lmc 11600 for example (that would be the same as +2050 offset)



Still going 40 minutes later lol, no signs of issues.

Either this ECC memory is next-gen (I can correct anything you throw at me!!!), or it is quite OCable. Going to see if it can survive Kombustor. +2000 memory definitely begins to hurt core with a 400w wall.


Kombustor is instantly like, FEED ME MORE POWER! Core hurting hard.

I do notice that spike to 411w in HWiNFO. It's not seen in the RivaTuner overlay, so it's obviously just a very quick burst to 411w. The fact this can happen, though, makes it very difficult to explain why the card can't power manage properly. It clearly _can_ go above 400w, but it's not doing it consistently and seemingly only in micro-bursts.

*edit* - Kombustor seems to be holding up fine, need a better way of testing memory. Guess I'll give Metro a spin! Error correction could be going on now, but _something_ should surely still crash lol.


----------



## aberrero

Just picked up a Zotac 12GB and getting pretty weak performance in 3DMark, below most 10G cards. Under 12k in PR and under 17k in Timespy. Even without an OC over stock, Afterburner says it is power limited most of the time. I have it set at 110% power limit, but I don't think the slider does anything, as I'm getting similar power draw and scores either way. 

I know people with 10G Zotac Trinity cards could flash the Asus Strix OC 10GB BIOS for a higher power limit. Anyone tried the Strix OC 12G BIOS on the Zotac 12G?

I got the Zotac cause it can take a reference water block, but I want to make sure it has some headroom before I put it on water. I don't care about pushing 450W through it, but I would like a bit of headroom for a modest OC to bring it up to par with other stock OC cards.


----------



## Panchovix

aberrero said:


> Just picked up a Zotac 12GB and getting pretty weak performance in 3DMark, below most 10G cards. Under 12k in PR and under 17k in Timespy. Even without an OC over stock, Afterburner says it is power limited most of the time. I have it set at 110% power limit, but I don't think the slider does anything, as I'm getting similar power draw and scores either way.
> 
> I know people with 10G Zotac Trinity cards could flash Asus Strix OC 10GB BIOS for a higher power limit. Anyone tried the Strix OC 12G BIOS on the Zotac 12G.
> 
> I got the Zotac cause it can take a reference water block, but I want to make sure it has some headroom before I put it on water. I don't care about pushing 450W through it, but I would like a bit of headroom for a modest OC to bring it up to par with other stock OC cards.


It depends on the number of PCI-E connectors. I'm not sure there's any Zotac card with 3x8-pin, because 3x8-pin is the requirement to flash some of these higher-power VBIOSes.

If your card is 2x8-pin, you're stuck at a 350W max power limit, so the only way to bypass that is a shunt mod; if you still have warranty, I wouldn't do that mod.


----------



## aberrero

Panchovix said:


> It depends of the amount of PCI-E connectors, I'm not sure if it's there any Zotac card with 3x8 pin, because that (3x8pin) is the requirement to flash some of these higher power VBIOS.
> 
> If your card is 2x8pin, you're stuck at 350W max power limit, so the only way to bypass that is shunt mod; if you still have warranty, I wouldn't do that mod.


Thank you, it's a 2x8pin. The 12GB card seems more power hungry than the 10GB card. At the same power limit, it seems to be giving worse performance, which is incredibly disappointing. I also have an EVGA FTW3U 3080Ti, and in the same system in PR I'm getting 2000 points less with the 3080 vs the Ti.

The people saying they were getting a higher OC with the Asus BIOS are in this reddit thread, but I think the 12GB card is supposed to have a 350W limit already:

https://www.reddit.com/r/ZOTAC/comments/j9aqo3


----------



## Panchovix

aberrero said:


> Thank you, it's a 2x8pin. The 12GB card seems more power hungry than the 10GB card. At the same power limit, it seems to be giving worse performance, which is incredibly disappointing. I also have an EVGA FTW3U 3080Ti, and in the same system in PR I'm getting 2000 points less with the 3080 vs the Ti.
> 
> The people saying they were getting higher OC with the Asus bios are in this reddit thread, but I think the 12GB cars is supposed to have a 350W limit already:
> 
> __
> https://www.reddit.com/r/ZOTAC/comments/j9aqo3


Here's the 3080 TUF 12GB VBIOS if you want to try; default is 350W and max TDP is 375W. I'm not sure if you'll manage to reach that tho.

Asus RTX 3080 VBIOS - 12 GB GDDR6X, 1260 MHz GPU, 1188 MHz Memory (www.techpowerup.com)

(Note it is unverified, so make a backup before flashing and such)

Also I agree, the 3080 12GB = 3080Ti basically, it performs better than the 10GB (or well, it should)


----------



## aberrero

Panchovix said:


> Here it is the 3080 TUF 12GB VBIOS if you want to try, default is 350W and max TDP is 375W, I'm not sure if you will manage to reach that tho
> 
> Asus RTX 3080 VBIOS - 12 GB GDDR6X, 1260 MHz GPU, 1188 MHz Memory (www.techpowerup.com)
> 
> (Note it is unverified, so make a backup for flashing and such)
> 
> Also I agree, the 3080 12GB = 3080Ti basically, it performs better than the 10GB (or well, it should)


I was planning on flashing this one: Asus RTX 3080 VBIOS

It's 390/450, but for a 3x8 card. The first poster in the reddit thread said they were using the 10gb Strix OC bios and hitting 440W on their 2x8pin Zotac.

The review samples for the 3080 12GB were all the MSI Suprim card, with reviews saying it was almost as fast as the 3080 Ti. I'm very disappointed with the Zotac sitting about 15% below a 3080 Ti with the stock OC.


----------



## acoustic

3x8pin BIOS on a 2x8pin card does not work. The software just mimics the 2nd (might be 1st? I forget) 8pin as the 3rd. People on Reddit are morons.

He thinks he's hitting 440w, but in reality he's likely at the same or lower PL as the stock BIOS.


----------



## Panchovix

aberrero said:


> I was planning on flashing this one: Asus RTX 3080 VBIOS
> 
> It's 390/450, but for a 3x8 card. The first poster in the reddit thread said they were using the 10gb Strix OC bios and hitting 440W on their 2x8pin Zotac.
> 
> The review samples for the 3080 12GB were all for the MSI Suprim card saying it was almost as fast as the 3080 Ti. I'm very disappointed with the Zotac sitting about 15% below a 3080Ti with stock OC.


Won't work sadly; a 3x8-pin VBIOS on a 2x8-pin card will give bad readings. His 440W is probably 440W minus one 8-pin (-150W), so in reality he is probably pulling 290-300W, less than stock.
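A quick way to sanity-check a reading like that, under the assumption that the mismatched VBIOS is counting a third 8-pin rail (~150W nominal) that physically isn't there:

```python
PCIE_8PIN_W = 150  # nominal rating per 8-pin PCI-E connector

def actual_draw(reported_w, phantom_connectors=1):
    """Estimate real board power when a 3x8-pin VBIOS runs on a
    2x8-pin card and the software credits a rail that doesn't exist.
    Illustrative only; real sensor behaviour varies per board."""
    return reported_w - phantom_connectors * PCIE_8PIN_W

print(actual_draw(440))  # → 290, in line with the 290-300W estimate
```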


----------



## aberrero

Thanks for the tips. I knew I should ask here before trusting reddit posts.

I spent some time working on an undervolt, which is giving me more respectable framerates within the power limits, in line with what I would expect from a 3080. I might shunt it in the future.

Can anyone confirm if the card power limit includes the fans and RGB? I might get a bit more power headroom by putting it on a water block and saving ~10W of board power.



acoustic said:


> 3x8pin BIOS on a 2x8pin card does not work. The software just mimics the 2nd (might be 1st? I forget) 8pin as the 3rd. People on Reddit are morons.
> 
> He thinks he's hitting 440w, but in reality he's likely at the same or lower PL as the stock BIOS.





Panchovix said:


> Won't work sadly, 3x8 pin VBIOS on 2x8 card will give bad reading, his 440W are probably 440W-1x8pin (-150), so in reality he is probably pulling 290-300W, less than stock.


----------



## aberrero

<delete>


----------



## acoustic

aberrero said:


> Thanks for the tips. I knew I should ask here before trusting reddit posts.
> 
> I spent some time working on an undervolt, which is giving me more respectable framerates within the power limits, in line with what I would expect from a 3080. I might shunt it in the future.
> 
> Can anyone confirm if the card power limit includes the fans and RGB? I might get a bit more power headroom by putting it on a water block and saving ~10W of board power.


Yes, fans are pulling board power. RGB as well if I'm not mistaken.

Not only does getting rid of the fans help power limit, but lower temps = potentially lower voltages as well


----------



## Astral85

Is it true the core will clock higher if the memory clock is reduced with this series? Does a core OC outweigh a mem OC? I've been somewhat disappointed with the overclocking of my Strix 3080 OC considering it's water cooled. I drop my first voltage and clock bin at just 40C. The next bin down comes around 44C and the third around 46-47C. Clock-wise this results in 2145, 2130 and 2115 (at around 46-48C). Do these seem like low temps for the card to be down-binning?

I can run +1500 on the mem but the core clock will refuse anything over +140/145 on my card and crash.


----------



## acoustic

Astral85 said:


> Is it true core clock will clock higher if memory clock is reduced with this series? Does core clock outweigh mem OC? I've been somewhat disappointed with the overclocking of my Strix 3080 OC considering it's water cooled. I drop my first voltage and clock bin at just 40C. Next bin down comes around 44C and the third around 46-47C. Clockwise this results in 2145, 2130 and 2115 (at around 46-48C). Do these seem like low temps for the card to be down binning?
> 
> I can run +1500 on the mem but the core clock will refuse anything over +140/145 on my card and crash.


If you're hitting your power limit, then yes, reducing memory OC will assist in the core clock since slightly more power can be used to keep the voltage up.


----------



## Astral85

acoustic said:


> If you're hitting your power limit, then yes, reducing memory OC will assist in the core clock since slightly more power can be used to keep the voltage up.


I'm not on the power limit. I'm playing Watch Dogs: Legion and the TDP sits around 85-90%. I'm unsure why the card crashes when I try to push the core clock any higher. I can only guess it's not a highly binned GPU, which is strange for a Strix... PerfCap Reason = VRel, VOp.


----------



## Imprezzion

aberrero said:


> Thanks for the tips. I knew I should ask here before trusting reddit posts.
> 
> I spent some time working on an undervolt, which is giving me more respectable framerates within the power limits, in line with what I would expect from a 3080. I might shunt it in the future.
> 
> Can anyone confirm if the card power limit includes the fans and RGB? I might get a bit more power headroom by putting it on a water block and saving ~10W of board power.


Just an extra answer: yes, fans and RGB draw board power. I went from the stock 3-fan cooler + RGB to a full-cover waterblock + external RGB controller and it saved quite a lot of power, 25-30w actually. I still can't run any higher than ~0.981v @ 2025Mhz curve limit but yeah, 2x8-pin 10G model.. I usually sit around 95-99% now (330-341w) but it does not throttle in effective clocks. I run "just" 1080p 280Hz, which has benefits because in general higher resolution = more power draw. Cyberpunk with no DLSS and RT Psycho doesn't throttle at 1080p, but if I bump it to 4K (super sampling / resolution scaling), even with DLSS Quality it throttles pretty bad.


----------



## Soulpatch

Okay, I've been searching and trying things for hours. I've seen posts all over the place about the black screen of death. But nobody seems to have a fix??? Finally got an EVGA RTX 3080ti and installed it. Now, whenever I boot the machine, it makes it all the way up to the Win10 login screen and then it goes black. Monitor power dims and I get a NO DP error. I've checked cables, run off of hdmi, swapped around DP ports on the card, disconnected the card and ran the onboard video (deleted the drivers for the card), downloaded older drivers, updated the entire system that may have been running something old, etc etc etc. Still no d*mn video when it gets to that point. Apparently EVGA knows about this, but I have yet to see a fix. Has anybody else had this issue?? I'd rather not RMA the card, since it means likely waiting another year for one, let alone removing the waterblock and re-installing the original fan system. But everything I've read online suggests there is a fix, but I have yet to find it. Does anybody have any ideas??? Little bit pissed after waiting all this time, dropping some coin on a paperweight.


----------



## acoustic

Soulpatch said:


> Okay, I've been searching and trying things for hours. I've seen posts all over the place about the black screen of death. But nobody seems to have a fix??? Finally got an EVGA RTX 3080ti and installed it. Now, whenever I boot the machine, it makes it all the way up to the Win10 login screen and then it goes black. Monitor power dims and I get a NO DP error. I've checked cables, run off of hdmi, swapped around DP ports on the card, disconnected the card and ran the onboard video (deleted the drivers for the card), downloaded older drivers, updated the entire system that may have been running something old, etc etc etc. Still no d*mn video when it gets to that point. Apparently EVGA knows about this, but I have yet to see a fix. Has anybody else had this issue?? I'd rather not RMA the card, since it means likely waiting another year for one, let alone removing the waterblock and re-installing the original fan system. But everything I've read online suggests there is a fix, but I have yet to find it. Does anybody have any ideas??? Little bit pissed after waiting all this time, dropping some coin on a paperweight.


Update the BIOS on the card, or search for NVIDIA's DisplayPort 1.3/1.4 firmware update tool; I think some have said that fixed it too. I don't have a link to it or I'd give it to you. If you have a monitor that is "high refresh rate" or has an "OC" mode, turn it off on the monitor and try again. It's caused by the higher refresh rates. The monitor I had at the time would work fine if I turned the OC mode on the monitor off. I can't remember if it was 160Hz or 144Hz.

You could grab a BIOS off of TechPowerUp for your GPU and flash it. I'd look for one that is newer than the one you're running now. I know that's what I did to fix it, and I had an early 3080TI when no one had any idea what was going on. That was a fun 2 days..

Either way, don't RMA the card over it. Very fixable! It's just a simple initialization miscommunication with DisplayPort on the card and your motherboard.


----------



## Imprezzion

acoustic said:


> Update the BIOS on the card, or search for the NVIDIA DisplayPort updater thing, I think some have said that fixed it too. I don't have a link to it or I'd give it to you. If you need to get into the BIOS, for now you will have to use an HDMI cable - that's what I had to do, then I figured out it was the GPU BIOS causing the issue, so just flashed to a newer one off of TPU. Easy peasy.
> 
> You could grab a BIOS off of TechPowerUp for your GPU and flash it. I'd look for one that is newer than the one you're running now.
> 
> Either way, don't RMA the card over it. Very fixable! It's just a simple issue caused by an initialization miscommunication with DisplayPort on the card and your motherboard.


Yeah I just saw JayzTwoCents' vid about it last week. Hopefully it's the same problem.


----------



## Soulpatch

Never updated the bios before. I've tried using a lower refresh rate monitor and even an hdmi connection instead of the DP. This is getting extremely frustrating. I keep finding info on bios updates, but nothing that is actually helpful. Everything is outdated. How do I update once I download the bios? Okay, found the ROM file, nvflash64. How do I use them? Is there anything specific I should know? I found the port update tool, but it seems pretty old? I've done a lot of custom stuff, but gpu's I've never really touched. Directions I've found are simple at best to say the least. So no idea how to update it correctly, and I don't want to "just do it" because the last thing I need is to brick the damn thing.

Why after 2 years of release time is this still an issue? I'm really at a loss and it's not only starting to frustrate the **** out of me, but now I'm to the point where I just want to box the damn thing up, take a bat and actually send it via impact back to them. And of course due to covid they have zero tech support during the weekend. Only crap I've been able to find online you either need to be a level one hacker, or the data is so old it doesn't even apply. One guy just posted a picture and said "do this". Could really use some actual help.


----------



## acoustic

Soulpatch said:


> Never updated the bios before. I've tried using a lower refresh rate monitor and even an hdmi connection instead of the DP. This is getting extremely frustrating. I keep finding info on bios updates, but nothing that is actually helpful. Everything is outdated, how do I update once I download the bios? Okay, found the ROM file, but what do I need to install it?


First page of this thread, download NVFlash:

How-To Flash RTX Video Card BIOS To A Different Series

This guide is a little dated but it's fairly simple and explained quite well. You want to backup your stock BIOS with GPUZ just to be safe, and then run NVflash as the guide will direct you.

I think what happened is the GPU you received is one of the early releases that do not have the BIOS updated yet. It happens.

Alternatively, you could email EVGA. They send out batch files that run on their own to update the BIOS if I'm not mistaken.

Oh, you might be able to get it from EVGA PRECISION X1. I think that will auto-detect your card and push a newer BIOS to you. Download Precision X1 here: https://www.evga.com/precisionx1/

I would uninstall Precision after and use Afterburner for your tweaking, though. Precision sucks, but it does push BIOS updates for EVGA cards if I remember right.

If you found the DP update tool, should be as simple as just running the executable and then rebooting. If it worked, you should have signal once the system POSTs and see your splash screen prior to Windows log-in.

The way to flash .. as I'm on my phone and it's a PITA to type all this, and I'm going off memory...

Go to device manager and disable the GPU

On the root of your C drive (or whatever letter your OS drive is), make a folder named nvflash and drop nvflash64 and the BIOS file you want to use in that folder.

Type "cmd" in the Windows search bar and launch Command Prompt .. type

cd..
dir

Make sure you're on the root of the C drive (should just see C:\) .. if not, type "cd.." again. Now type

cd nvflash
nvflash64 -6 thenameoftheBIOS.rom

Yes through the prompts and let it flash.

Reboot

It SHOULD work on that first reboot, but if not, reboot again. If it still doesn't.. well.. might have DL'd another poop BIOS with the DP issue.
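The same sequence could be scripted; a minimal sketch, assuming nvflash64 is on PATH or in the working folder and using nvflash's --save flag for the backup step (GPU-Z works for the backup too, as noted above). The function names here are just for illustration:

```python
import subprocess

def flash_cmds(rom_name, exe="nvflash64"):
    """The two commands, in order: save the current BIOS as a backup,
    then flash rom_name with -6 (override the subsystem-ID mismatch
    check, as in the manual steps above)."""
    return [
        [exe, "--save", "backup.rom"],
        [exe, "-6", rom_name],
    ]

def flash(rom_name):
    # Run from an elevated prompt in the nvflash folder, with the GPU
    # disabled in Device Manager first; reboot afterwards.
    for cmd in flash_cmds(rom_name):
        subprocess.run(cmd, check=True)
```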


----------



## Soulpatch

Thank you, I've looked and searched and couldn't find anything even remotely as descriptive. I'll do this and see if it helps. Really cheeses me off because it's not like I'm a computer idiot. Been building custom pc's for 30 years and this is the first time I've ever encountered anything like this.


----------



## acoustic

Soulpatch said:


> Thank you, I've looked and searched and couldn't find anything even remotely as descriptive. I'll do this and see if it helps. Really cheeses me off because it's not like I'm a computer idiot. Been building custom pc's for 30 years and this is the first time I've ever encountered anything like this.


I hope your FTW3 isn't one of the early batches with the power balancing issues; that would be my bigger concern. My original 3080 FTW3 from release died 2 months ago. Turned the PC off one day, turned it on, and the card said "yeet".


----------



## Soulpatch

Well, went through and did it old school dos commands to update it. Rebooted no problems with the monitor plugged into the DP on the card. Unfortunately as soon as I went into device manager and enabled the card, it went from the horrible "safe mode looking screen" to the full screen...then promptly dropped the signal. So I know the card actually works, or at least will support video. Just no idea what the hell is going on otherwise. Used the precision X1 (didn't even know about that, always used MSI). It updated, but didn't make a damn bit of difference. Get the feeling it's going to be another very long wait for an RMA... At a loss for what to do next. Really hate to take it all back apart (liquids) to put the 1070 back in...but what the hell is the point in paying $1500 for a card that isn't going to work? The z390 board doesn't allow me to turn on/off onboard graphics, so the only way to disable the onboard is in windows itself. Which doesn't actually help any. Was trying to think of any possible settings either in the uefi or windows that would make a difference. The card is detected, but as soon as it's enabled flushes itself right down the ****ter. If you have any other ideas may as well try them. Will evga rma a card that I had to take apart to install the waterblock? Have all the parts still, just didn't want to find out I'm going to eat the cost of it on top of everything else. This day just keeps getting better and better.


----------



## acoustic

EVGA does not void warranty for removing the cooler, actually a big perk.

Your card should have dual BIOS if it's a FTW3. Shut the PC down, and flip the switch on the card to the other BIOS - you might go back to having the problem you had before (no signal until Windows boot) but should get you back into Windows. Please link what BIOS you used to flash the card as well.

You could also, if you have the space, install the 1070 in your lower PCIE slots to get signal in Windows to see what code the 3080 is throwing.

Could just be the NV driver flipped out when you re-enabled the card in device manager and she needs a clean out. Typically disabling the card in device manager will stop any driver issue from happening, but nothing is ever a guarantee when it comes to this ****.


----------



## Soulpatch

No dual bios...it's the 3080ti XC3 ultra. Got the 3080ti bios at techpowerup, along with nvflash (maybe I got the wrong ones?) They do a blanket file that's supposed to cover so many models, but sometimes they don't. It's a possibility I guess. Is there a better place to get them? Maybe I need to do that, try somewhere else. Thought they were the latest bios, but just went in to search and can't even find the Ti now, just the 3080 blanket versions. Already cleaned out the nvidia driver before starting this whole mess. Even tried installing an older version at one point, just to see. Unfortunately it's vertically mounted, not horizontal. So adding the 1070 back in, even to test it would be a serious pia. At least it doesn't void the warranty if I end up having to ship this pos back. Just wish I could figure it out. 2 years of waiting, moved 900 miles and now this.


----------



## acoustic

VGA Bios Collection: EVGA RTX 3080 Ti 12 GB | TechPowerUp 

This is the one you should have used for your model.


----------



## Soulpatch

That's the one I have. We went and saw the royals play, so while out I took out the battery, unplugged, etc. Just to see if it would do anything when booting back up. Think I'm just screwed no matter how you look at it. Going to try to flash the bios one last time and then it just is what it is. I'll have to see if I can get ahold of somebody from evga and hope for the best. Really do appreciate the help, gave me some options I didn't think about. If something pops into your head, throw it up and I'll give it a shot. Or if by some freak chance I manage to get it all working... I'll post up what actually did it.


----------



## acoustic

Soulpatch said:


> That's the one I have. We went and saw the royals play, so while out I took out the battery, unplugged, etc. Just to see if it would do anything when booting back up. Think I'm just screwed no matter how you look at it. Going to try to flash the bios one last time and then it just is what it is. I'll have to see if I can get ahold of somebody from evga and hope for the best. Really do appreciate the help, gave me some options I didn't think about. If something pops into your head, throw it up and I'll give it a shot. Or if by some freak chance I manage to get it all working... I'll post up what actually did it.


Sorry brother, wish I had the answer for ya. Best of luck.


----------



## Soulpatch

Ya, I don't get it either. It flashes fine, reboots with the card (cable in the port). But as soon as I activate it in device manager, it will change the video setting and then crash shortly after. If I don't activate it, it works. Well, sort of considering the quality looks like something from the 80's. But ya, have no ideas anymore. Deactivate the card, flash the bios, reboot with the monitor plugged into the card and it's fine until that point of reactivating it. So ya, I got nothing at this point.


----------



## acoustic

Soulpatch said:


> Ya, I don't get it either. It flashes fine, reboots with the card (cable in the port). But as soon as I activate it in device manager, it will change the video setting and then crash shortly after. If I don't activate it, it works. Well, sort of considering the quality looks like something from the 80's. But ya, have no ideas anymore. Deactivate the card, flash the bios, reboot with the monitor plugged into the card and it's fine until that point of reactivating it. So ya, I got nothing at this point.


Have you tried flashing a different BIOS? There was a different XC3 BIOS on there from what I briefly saw.

****, at this point.. I'd try flashing an ASUS TUF BIOS to it. As long as it's a 2x8pin BIOS, you're good to play around.


----------



## Soulpatch

Tried another one that was pretty close to the make/model. But didn't want to get too nuts. But ya, you're right, at this point may as well try what I can. But also want to make sure I don't brick it. Just too damn frustrated at this point. Either it's going to be an RMA, or the evga guys will send me a file that will magically fix it all. What I don't get is all i've read is that there are literally thousands of these cards all having the same issue...did somebody drop the ball just like the initial release? Why no direct fixes? The average computer user wouldn't have the foggiest idea where to even begin to fix this. At this point even I'm stumped and feel dumb as a rock as to what's causing the issue, or even how to fix it.


----------



## Soulpatch

Ordered an 8k cable, but if that solves the issue, will my other monitors work then? Running an MSI ultra wide as the main @ 144hz, 32" led tv (hdmi to DP adapter) and a smaller screen (hdmi) that is used to show sensor points. I'll be a bit more pissed if they no longer work and the whole monitor setup is now worthless.


----------



## acoustic

Soulpatch said:


> Ordered an 8k cable, but if that solves the issue, will my other monitors work then? Running an MSI ultra wide as the main @ 144hz, 32" led tv (hdmi to DP adapter) and a smaller screen (hdmi) that is used to show sensor points. I'll be a bit more pissed if they no longer work and the whole monitor setup is now worthless.
> View attachment 2556766


If it turns out to be a cable issue, I'm going to apologize and also throw myself off a cliff for recommending you flash your GPU BIOS before something much simpler. LOL. I had no idea you were using multi-monitors. My suggestion in that case is *always* go down to one monitor plugged in without any adapters. I assume you did that for troubleshooting purposes?

Cute kitty btw!


----------



## Soulpatch

lol Oh ya, I disconnected all of them but the main. The only time I tested any other monitor it was the LED tv @ 60hz and it was the only one connected, just so I could try the hdmi port on the card. But at this stage it's a last-ditch Hail Mary attempt at trying to get it to function correctly. But I don't have high hopes for it; think it's mostly because I know the EVGA tech will likely ask, you know, like "did you turn it on/off, remove the battery, etc etc etc". I'm thinking odds are good unless he has some miracle secret fix, it'll be an RMA at this point. I did try other DP cables as well, but they were only 4k (1.2) and not the 8k (1.4). Apparently it's a potential issue? But like I said before, if that's the issue, then that means monitors across the country would be bricks at this point. How many people are running this card with a standard monitor?


----------



## fray_bentos

Soulpatch said:


> lol Oh ya, I disconnected all of them but the main. The only time I tested any other monitor it was the LED TV @ 60Hz, and it was the only one connected, just so I could try the HDMI port on the card. But at this stage it's a last-ditch Hail Mary attempt at trying to get it to function correctly. I don't have high hopes for it; I think it's mostly because I know the EVGA tech will likely ask, you know, like "did you turn it on/off, remove the battery, etc etc etc". I'm thinking odds are good that unless he has some miracle secret fix, it'll be an RMA at this point. I did try other DP cables as well, but they were only 4K (DP 1.2) and not 8K (DP 1.4). Apparently it's a potential issue? But like I said before, if that's the issue, then that means monitors across the country would be bricks at this point. How many people are running this card with a standard monitor?


Have you tried using a non-ultrawide screen as your main (e.g. a TV over HDMI)? Could just be a resolution switching issue.


----------



## Soulpatch

If you read through the posts... yep. Even did an HDMI port attempt (low refresh rate), since some posts thought that may be a cause. Although I'd find that weirdly ironic, since it's a card designed to do nothing BUT high-end video. Went through and updated any non-essential drivers through Windows, double-checked all the drivers/BIOS for the motherboard, etc. Everything is current and up to date on that front. So at a loss currently.


----------



## Soulpatch

Made some progress??? Okay, I have been chasing this thing for days now, but I seem to have made some progress... I think. I can't get the main monitor to run on the card's DP, but somehow I've managed to get the other two monitors to run off of it. The 32" LED TV is on one of the card's DP ports, and the small heads-up display is on the card's HDMI port. BUT I still can't get the main monitor to come up for longer than a minute before it goes blank with the dreaded "DP no signal". It also crashes the other two monitors.

Tried multiple combinations of turning off the motherboard GPU, etc., but it doesn't seem to make a difference. I'm starting to wonder if there is something related to the refresh rate of the monitor or even the port characteristics (DP 1.2 vs 1.4)? It's not much, but at least it's something. The 8K cable comes tomorrow, so I will swap it out first thing (likely while waiting on hold with EVGA).

Maybe something in the card has an issue with detecting high refresh rates and tries to force them through the cable instead of adjusting itself to match what is there. But wouldn't more people have the same issues? And why doesn't it seem to have a problem running a 32" LED TV and a small 5x10" LED, both of which are running 60Hz refresh rates at 1920x1080? There has to be an answer, and I'm really hoping that EVGA knows what it is....


----------



## acoustic

Very strange. Could very well be the cable..


----------



## Soulpatch

Apparently not as much progress as I thought? Talked to an EVGA tech and described the whole situation and he thinks something is wrong with the board. So doing an RMA whether I want to or not. I have a new cable supposed to be here today, so will likely throw it on just for fun to see. But it wouldn't make sense if it did.


----------



## Soulpatch

Soulpatch said:


> Apparently not as much progress as I thought? Talked to an EVGA tech and described the whole situation and he thinks something is wrong with the board. So doing an RMA whether I want to or not. I have a new cable supposed to be here today, so will likely throw it on just for fun to see. But it wouldn't make sense if it did.


Update: Cable didn't do anything. So it has to be something board related. The fact that two monitors function and the third won't leads me to believe there is something wrong with the bios/firmware, but even after updating it didn't change anything. So if anybody else has been following with the same issue, you'll likely have to RMA it. Going to be a little over a week (I paid the cover deposit) to get the new one, but the alternative is 3-4 weeks otherwise.


----------



## zebra_hun

New record in Shadow of the Tomb Raider Benchmark. I'm happy with it.
1440 Highest.

















DLSS enabled, quality:


----------



## Panchovix

Got a little better scores in TimeSpy (graphics) and Port Royal, ReBAR forced.
Man I wish I had better RAM, so it wouldn't wreck my CPU score with ReBAR enabled lol.



















Without ReBAR, I get this









Which netted me 14th place at least with my CPU/GPU combo


----------



## yzonker

Panchovix said:


> Got a little better scores in TimeSpy (graphics) and Port Royal, ReBAR forced.
> Man I wish I had better RAM, so it wouldn't wreck my CPU score with ReBAR enabled lol.
> 
> View attachment 2558272
> 
> 
> View attachment 2558273
> 
> 
> Without ReBAR, I get this
> 
> View attachment 2558274
> 
> Which netted me 14th place at least with my CPU/GPU combo
> View attachment 2558277


The 5800x3D takes a very small hit in the TS CPU test with reBar forced. Although if you are following that thread you know that I've been struggling to match my 5800x PR score. But I slightly beat my previous 5800x graphics score in TS with the 5800x3D (cpu was still a little lower).


----------



## Falkentyne

Audioboxer said:


> I'll have a look then.
> 
> I sent a support ticket to EVGA with my proof to ask for support and also check if there are any other BIOS files to try.
> 
> In the meantime I will have a look on techpower and see if there are other LHR 3080 cards with 450w+ BIOS. Who else makes a 3 pin card?
> 
> 
> 
> Here is quite a clear example
> 
> View attachment 2555290
> 
> 
> Coming in just under 400w, maintains 2100mhz at 1.0v.
> 
> View attachment 2555291
> 
> 
> Set to run at 2160 @ 1.05v, pegs at 400w, frequency craters to 2010mhz.
> 
> Given I've seen this
> 
> View attachment 2555292
> 
> 
> Which looks like a small spike to 409.3w, looks to me like software is fine. There is something wrong with the card/BIOS. There are other posters on the EVGA forums complaining the LHR bios either doesn't go above 400w, or one person mentioned 408w.


The card hits a power limit because one of the 8 pins reaches 150W, which IIRC is the "100%" Normalized limit for any of the 8 pins. Any rail or sub rail hitting its normalized limit causes normalized TDP to report 100%, which triggers a power limit even if total TDP is nowhere near 100%. GPU-Z won't show normalized limit values (only the rail with the highest normalized limit relative to its "base" value gets reported--this is similar in function to GPU "memory junction temp" or "Hotspot"); you need HWinfo64 to see TDP Normalized%.

Notice that 8 pin #3 only reaches 71.8W. I haven't been paying attention to this thread but I've only seen that on FTW3 cards.
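The normalized-limit behaviour described above can be sketched in a few lines. The 150W-per-8-pin figure and the 71.8W reading come from the posts; every other number here is invented for illustration, not read from any real vBIOS:

```python
# Hypothetical per-rail limits in watts; real values live in the vBIOS
# power table, these are made-up examples.
RAIL_LIMITS = {"8pin_1": 150.0, "8pin_2": 150.0, "8pin_3": 150.0, "pcie_slot": 66.0}

def normalized_tdp(readings):
    """Return the highest per-rail utilization (%) and which rail set it.

    The card power-limits as soon as ANY single rail reaches 100% of its
    own limit, even if total board power is well below the board cap.
    """
    pct = {rail: 100.0 * watts / RAIL_LIMITS[rail] for rail, watts in readings.items()}
    worst = max(pct, key=pct.get)
    return pct[worst], worst

# One pegged 8-pin trips the limit even though the board total
# (150 + 120 + 71.8 + 40 = 381.8W) is nowhere near a 450W cap.
readings = {"8pin_1": 150.0, "8pin_2": 120.0, "8pin_3": 71.8, "pcie_slot": 40.0}
pct, rail = normalized_tdp(readings)
print(f"TDP Normalized: {pct:.1f}% (limited by {rail})")
```

This is why HWiNFO64's "TDP Normalized%" can read 100% while total board power looks fine.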


----------



## joyzao

Hi guys,

xoc bios for rtx 3080 12gb ftw3 ultra exist?


----------



## zgigi666999

Wondering what I can do to boost performance on my 3080 FTW3 Ultra 12GB. I see the BIOS on the card is already 450W, so I'm not sure what can be done, since my temps are very good when gaming: around 54°C at a fan speed of 70%, and memory temps at 66°C while gaming for 1h straight. I wanted to push my card further to squeeze more performance out of it. The problem is I only have an 8700K and a 750W power supply, but I've never had any reboot while gaming.


----------



## Nizzen

joyzao said:


> Hi guys,
> 
> xoc bios for rtx 3080 12gb ftw3 ultra exist?


Not in public. Shuntmod is your only option.


----------



## mouacyk

zgigi666999 said:


> Wondering what I can do to boost performance on my 3080 FTW3 Ultra 12GB. I see the BIOS on the card is already 450W, so I'm not sure what can be done, since my temps are very good when gaming: around 54°C at a fan speed of 70%, and memory temps at 66°C while gaming for 1h straight. I wanted to push my card further to squeeze more performance out of it. The problem is I only have an 8700K and a 750W power supply, but I've never had any reboot while gaming.


Looks like you already have power limit headroom, so the only way to get more performance is get temps below 40C. If you can do so, you will gain about 45MHz that was lost due to temp throttling. If you haven't already done so, you can tweak your freq/volt curve to get optimal baseline performance -- this one depends on silicon quality and how well tuned the stock curve is already.
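As a rough illustration of the freq/volt curve tweaking described above, here is a toy sketch of the usual Afterburner approach: shift the whole curve up by an offset, then flatten everything above a chosen voltage so the card never requests more than that voltage. Every curve point below is invented for the example, not measured from any card:

```python
# Toy voltage (V) -> frequency (MHz) curve points; illustrative only.
stock_curve = {0.80: 1830, 0.85: 1905, 0.90: 1980, 0.95: 2040, 1.00: 2100, 1.05: 2145}

def flatten_curve(curve, cap_v, offset_mhz=0):
    """Apply a core offset, then clamp every point above cap_v to the
    frequency at cap_v -- the familiar 'flat-topped' undervolt curve."""
    shifted = {v: f + offset_mhz for v, f in curve.items()}
    cap_freq = shifted[cap_v]
    return {v: min(f, cap_freq) for v, f in shifted.items()}

efficient = flatten_curve(stock_curve, cap_v=0.90, offset_mhz=60)
# Points at 0.95V and up now sit at the 0.90V frequency (2040 MHz here),
# so under load the card stops requesting voltage beyond 0.90V.
```

How far the offset can go before instability is exactly the silicon-quality question mouacyk mentions; the shape of the tool is the same either way.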


----------



## Panchovix

mouacyk said:


> Looks like you already have power limit headroom, so the only way to get more performance is get temps below 40C. If you can do so, you will gain about 45MHz that was lost due to temp throttling. If you haven't already done so, you can tweak your freq/volt curve to get optimal baseline performance -- this one depends on silicon quality and how well tuned the stock curve is already.


Can confirm. I'm still on air on my shunted TUF (just found there is an AIO for this card, the Alphacool Eiswolf 2 AIO - 360mm RTX 3080/3090 TUF with backplate, though 250 Euros is something I can't afford right now for an AIO lol). Testing the good ol' 3DMark 05 with fans at 100%, the card was at 32°C lol, and the clocks were at 2160MHz without much issue.

On TimeSpy, by contrast, which obviously asks way more (470-500W or so), the card gets up to 50-55°C and settles at 2115MHz or so.
Temps do affect the clocks heavily.


----------



## Soulpatch

acoustic said:


> If it turns out to be a cable issue, I'm going to apologize and also throw myself off a cliff for recommending you flash your GPU BIOS before something much more simple. LOL. I had no idea you were using multi-monitors. My suggestion in that case is *always* go down to one monitor plugged in without any adapters. I assume you did that for troubleshooting purposes?
> 
> Cute kitty btw!


Okay, it definitely wasn't the cable. Got the new card in finally, sending the old one back. Tested it air-cooled to make sure it would POST, and it worked perfectly. Installed the waterblock, put it all back together, and here I am. Updated the BIOS using EVGA's software and it smoothed out quite a bit; apparently they have a major issue with that. Still need to do some tweaking and maybe some overclocking, but I want to get it all settled first before going too deep. Will the EVGA software update to the best BIOS option, or should I do an upgrade using NVFlash? Wasn't sure if they were the same but named differently.
Thanks for all the help. Really appreciate the feedback/knowledge. No one person can know everything, and honestly, I've never really done anything GPU-related when it came to flashing BIOSes or even having issues with them. My 1070 ran fantastic, and aside from a few software updates and a little overclocking through MSI, I never had an issue with it. Sorry to see it go, but I'm likely going to post it up for sale with the waterblock. So if anybody is looking, shoot me a PM. I'm reasonable and not looking for top dollar. The cat's name is Joey, and I'm pretty sure he's going through his teenage years....


----------



## zgigi666999

mouacyk said:


> Looks like you already have power limit headroom, so the only way to get more performance is get temps below 40C. If you can do so, you will gain about 45MHz that was lost due to temp throttling. If you haven't already done so, you can tweak your freq/volt curve to get optimal baseline performance -- this one depends on silicon quality and how well tuned the stock curve is already.


Should I use MSI Afterburner or EVGA's tool? I got an FTW3 Ultra 3080, not sure what I should use to overclock with the curve.


----------



## mouacyk

zgigi666999 said:


> Should I use MSI Afterburner or EVGA's tool? I got an FTW3 Ultra 3080, not sure what I should use to overclock with the curve.


probably get most help with MSI afterburner on here

here's my max, sub-ambient, curve (notice it hits stable points at known voltage points like 1.0, 0.9, 0.85 etc):









here's my efficient curve that tops out at 0.9v:


----------



## andrew149

zgigi666999 said:


> Should I use MSI Afterburner or EVGA's tool? I got an FTW3 Ultra 3080, not sure what I should use to overclock with the curve.


Msi afterburner just works.


----------



## zgigi666999

andrew149 said:


> Msi afterburner just works.


Thanks. Question: will NVIDIA ReBAR still work if I use MSI Afterburner?


----------



## mouacyk

zgigi666999 said:


> Thanks. Question: will NVIDIA ReBAR still work if I use MSI Afterburner?


yes


----------



## zgigi666999

mouacyk said:


> yes


how do i know if it's activated or not ?


----------



## Falkentyne

zgigi666999 said:


> how do i know if it's activated or not ?


GPU-Z.


----------



## Tobe404

I caved and got an ASUS Tuf OC 3080 two days ago

Very impressed with the 3080 Tuf cooling and noise levels

Still tinkering with UV and OC but so far it's at

2100 / 1150 OC at 1.031v power limit 110%

Hyperthreading disabled on 10850k.

Firestrike run


----------



## S4squatch

Noctua edition of 3080 coming soon.








A Finnish retailer has listed it for 1149,90€. I wish I had known this a month earlier. I bought the Noctua 3070 thinking there would be no more powerful cards with this cooler. Dang!


----------



## Panchovix

S4squatch said:


> Noctua edition of 3080 coming soon.
> View attachment 2559830
> 
> A Finnish retailer has listed it for 1149,90€. I wish I had known this a month earlier. I bought the Noctua 3070 thinking there would be no more powerful cards with this cooler. Dang!


DAMN, even though I have my TUF for like 1 year and half, I would easily get that 3080 Noctua if it was available here in Chile lmao, I loved that card, was a bummer only the 3070 had that cooler.


----------



## nikoli707

S4squatch said:


> Noctua edition of 3080 coming soon.
> View attachment 2559830
> 
> A Finnish retailer has listed it for 1149,90€. I wish I had known this a month earlier. I bought the Noctua 3070 thinking there would be no more powerful cards with this cooler. Dang!


here is mine


----------



## opheen

Results after some runs with my EVGA RTX 3080 FTW3 Ultra 10GB with a cheap Bykski waterblock. Pretty happy with the results! So it's not the last time I buy a Bykski block, for sure.


----------



## Imprezzion

opheen said:


> View attachment 2559861
> 
> View attachment 2559864
> 
> 
> View attachment 2559862
> 
> View attachment 2559863
> View attachment 2559861
> 
> 
> Results after some runs with my EVGA RTX 3080 FTW3 Ultra 10GB with a cheap Bykski waterblock. Pretty happy with the results! So it's not the last time I buy a Bykski block, for sure.


Bykski makes great blocks for sure. My Gigabyte Gaming OC has been running one for 6 months now and it's never been above 50°C core / 60°C hotspot. Memory temps are pretty bad tho, cause the included thermal pads are not great. 72°C-ish for me at +1200.


----------



## opheen

Imprezzion said:


> Bykski makes great blocks for sure. My Gigabyte Gaming OC has been running one for 6 months now and it's never been above 50°C core / 60°C hotspot. Memory temps are pretty bad tho, cause the included thermal pads are not great. 72°C-ish for me at +1200.


I did not use the pads that came with the block; I used some 20 W/mK 1mm putty around the memory, stacked 8mm pads to cool the chokes, and put some 1mm on the backside even though you don't have to. My mem temps are 40 to 55-60°C.


----------



## opheen




----------



## Tobe404

Managed to crack the #1 spot for a 10850k / 3080 combo in Australia or top 6% overall.

















So close to 33k.

I don't know why but the combination score is always higher with HT disabled and lower when it is enabled?


----------



## Panchovix

Tobe404 said:


> Managed to crack the #1 spot for a 10850k / 3080 combo in Australia or top 6% overall.
> View attachment 2559892
> 
> 
> View attachment 2559893
> 
> So close to 33k.
> 
> I don't know why but the combination score is always higher with HT disabled and lower when it is enabled?


Pretty nice, but I think you can do better using the 30.0.14.9676 driver, since for some reason it gives a pretty big jump in FireStrike (5XX drivers help a bit, but not like 496.76); this is mine on my TUF + 5800X.

About HT, it may happen; it's mostly an "issue" with more than 8 cores (so 10 or more, if I'm not wrong), and on FireStrike only. TimeSpy shouldn't be affected.
















I scored 38 518 in Fire Strike: AMD Ryzen 7 5800X, NVIDIA GeForce RTX 3080 x 1, 32768 MB, 64-bit Windows 10 (www.3dmark.com)


----------



## Tobe404

I think the 496.76 drivers seem to improve a lot of things, not just Fire Strike.

Nvidia Driver: 496.76 vs 511.23 - God of War [New Driver=Less FPS] - YouTube

A lot of comments say they lost 10-20 FPS in games with the newer driver.

Edit: So changing drivers to 496.76 and 9c/18t resulted in this








Going up to 10c/20t tanks the combined score.

I did crack a 47000+ graphics score when testing core/thread counts, but that was at 8c/8t by accident, so the physics and overall scores were way down.

Hopefully I can crack it again sometime with 9c/18t enabled

Edit 2: Went back to 512.59 drivers. Combined score tanks even on 9c/18t. So it appears it's a driver and core/thread count issue.


----------



## Panchovix

Tobe404 said:


> I think the 496.76 drivers seem to improve a lot of things not just firestrike.


At least on the benchmark side, it helps a lot in FireStrike, but there's a performance decrease in TimeSpy and Port Royal vs the 472.12 or 466.63 drivers.

And yeah, lately I've been using 5xx drivers and they aren't good IMO; I'll probably downgrade to 496.76 or 472.12, depending on what I'll be playing lol


----------



## opheen

Tobe404 said:


> I think the 496.76 drivers seem to improve a lot of things not just firestrike.
> 
> Nvidia Driver: 496.76 vs 511.23 - God of War [New Driver=Less FPS] - YouTube
> 
> A lot of comments saying they lost 10-20FPS in games with the newer driver,
> 
> Edit: So changing drivers to 496.76 and 9c/18t resulted in this
> View attachment 2559906
> 
> Going up to 10c/20t tanks the combined score.
> 
> I did crack a 47000+ graphics score when testing core/thread counts, but that was at 8c/8t by accident, so the physics and overall scores were way down.
> 
> Hopefully I can crack it again sometime with 9c/18t enabled
> 
> Edit 2: Went back to 512.59 drivers. Combined score tanks even on 9c/18t. So it appears it's a driver and core/thread count issue.


----------



## opheen

I used 512.15. Graphics score: 49086.


----------



## zebra_hun

Tobe404 said:


> Managed to crack the #1 spot for a 10850k / 3080 combo in Australia or top 6% overall.
> View attachment 2559892
> 
> 
> View attachment 2559893
> 
> So close to 33k.
> 
> I don't know why but the combination score is always higher with HT disabled and lower when it is enabled?


Good to see my scores, lol. The 10850K is a great CPU. OC'ed RAM, CPU and GPU 















bf.derertcombat, it's me 
Secret is uncore at 5100MHz, and a RAM OC. Used fixed voltage and frequency. Look at the third session.


----------



## Tobe404

zebra_hun said:


> Good to see my scores, lol. 10850k is great cpu. Oc'ed ram, cpu and gpu
> bf.derertcombat its me
> Secret is uncore 5100MHz, and ram oc. Used fixed voltage and frequenz. Look at third session.


I wouldn't be game to run that high a voltage on my 10850K, to be honest. Maybe I'm not seeing it, but what are the temps like for CPU and GPU?


----------



## zebra_hun

Not much. My daily setting is x50/[email protected]
CPU under load is 1.10V for all-core 5GHz.
This is Forza 5 Benchmark:
FH5 BM
This is a stability test:
P95 non avx + Furmark
Infos are there.


----------



## Panchovix

opheen said:


> I used 512.15 Graphics score 49086


Which core-mem overclocks?


----------



## opheen

Panchovix said:


> Which core-mem overclocks?


1,297 MHz when I run 2190MHz, and 1,313 when I run at 2145MHz


----------



## Panchovix

opheen said:


> 1,297 MHz when i run 2190Mhz and 1,313 when i run att 2145Mhz


I envy those 2190MHz lol; on benches my stable is 2160MHz, and 2130MHz in games


----------



## Tobe404

opheen said:


> I used 512.15 Graphics score 49086


Is yours Shunt modded?


----------



## opheen

Tobe404 said:


> Is yours Shunt modded?


No, it's only the 450W XOC BIOS, liquid metal on the GPU die, 20 W/mK pads, and a Bykski waterblock.


----------



## Panchovix

Tobe404 said:


> Is yours Shunt modded?


Shunt mod will only get you at most 450-500W in 2x8 pin cards basically.

On 3x8 pin cards, just using the 450W XOC VBIOS will be enough.
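For anyone wondering why a shunt mod raises the effective limit, the arithmetic is just parallel resistance: the controller infers current from the voltage drop across a shunt resistor, so lowering the shunt resistance makes it under-report power. A sketch with typical assumed values (not specs for any particular card):

```python
def reported_power(actual_w, r_stock_mohm=5.0, r_added_mohm=5.0):
    """Power the card *reports* after stacking a shunt in parallel.

    Two resistors in parallel: R_eff = (R1*R2)/(R1+R2). A smaller R_eff
    means a smaller voltage drop for the same current, so the controller
    scales its power reading down by R_eff / R_stock.
    """
    r_eff = (r_stock_mohm * r_added_mohm) / (r_stock_mohm + r_added_mohm)
    return actual_w * (r_eff / r_stock_mohm)

# Paralleling an equal-value shunt halves the reported draw, so a 450W
# vBIOS limit effectively becomes ~900W of headroom (the PCB and PSU
# permitting, which in practice is the real ceiling).
half_reported = reported_power(500.0)  # card reports 250.0W for a real 500W
```

In practice VRM, connector and PSU limits cap things well before the "doubled" ceiling, which is why 2x8-pin cards still top out around 450-500W as noted above.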


----------



## leven

Hello.
I have a request for the initiated.
I have a Zotac RTX 3060 Ti card.
The card's BIOS offers a 200W PL, which causes the GPU clock to drop to 1870MHz and the voltage to 0.910V while playing.
Please modify the BIOS in the ABE program.
Values to be entered:

GPU Clock: 2000MHz
Mem Clock: 8000MHz
Vgpu: 0.980V
PL: 280W

Thank you - leven.


----------



## mouacyk

Not for a million dollars, or more.


----------



## opheen

Hey, 
is there anyone else in here with the Gigabyte AORUS WaterForce Xtreme WB RTX 3080 10GB?
I came across such a card, unused, at a good price. I have to say I am impressed with the temperatures and performance despite the 370W TDP. 2025MHz @ 0.944V, and not even Warzone manages to get the card unstable!


----------



## zebra_hun

UE5 Bench:
UV OC:









Opheen:

This is non Aorus WB. 
This is my card.


----------



## masscrazy

Question if you might - I want to build a custom loop in the meshlicious. Have a few 3080 options.

*Asus TUF 3080 OC V2* or *MSI 3080 Ventus 3x OC Plus* - which, if anyone knows is better for watercooling or better card in general. The Ventus is great with deshrouding (may do in short term).


----------



## opheen

masscrazy said:


> Question if you might - I want to build a custom loop in the meshlicious. Have a few 3080 options.
> 
> *Asus TUF 3080 OC V2* or *MSI 3080 Ventus 3x OC Plus* - which, if anyone knows is better for watercooling or better card in general. The Ventus is great with deshrouding (may do in short term).


Personally, I like Asus, and the TUF card has a good reputation when it comes to temperatures. But if the plan is to put on a water block, then buy the cheapest. It is wise to check that there are water blocks available for the card you are going to use and what the total price (GPU + block) ends up being. That's my opinion. Both cards are 2x8-pin.


----------



## masscrazy

opheen said:


> Personally, I like Asus and the TUF card has a good reputation when it comes to temperatures. But if the plan is to put on a water block, then buy the cheapest, it is wise to check out that there are water blocks available for the card you are going to use and what the total price is, GPU + Block so you see what it ends up in total. That's my opinion. Both cards are 2x8pin.


Yea, my aim is not extreme overclocking; some would be nice, but otherwise it's to have a quiet, unobtrusive system.

So deshroud or watercooled is what I am thinking. And there are waterblocks from most companies (except EKWB) available for the Ventus. They are pricey though, especially the Corsair one at £159.


----------



## opheen

masscrazy said:


> Yea my aim is not extreme overclocking, some would be nice, otherwise its to have a quiet unobtrusive system.
> 
> So deshroud or watercooled is what I am thinking. And there are waterblocks from most companies (except EKWB) for the Ventus available. They are pricey, especially the corsair one, for £159.


If you go for hoses and pumps from, for example, Barrow, and 2x240mm radiators (or 1x240mm plus 1x120mm), then the cost of a complete water cooling system will not be so bad. When it comes to the water block, I would again go for the cheapest; I have had blocks from EK, Alphacool, Barrow and Bykski. Bykski blocks tend to have a good price and an included backplate.

The EVGA FTW3 3080 or Asus ROG Strix 3080 is what I would choose if you really want to go all out on overclocking, as they have vBIOSes with a 450W TDP easily accessible; the MSI Suprim and Gigabyte AORUS Master are not bad either. All the mentioned cards have 3x8-pin power connectors.
The reason I recommend EVGA and Asus is that if something should happen to the graphics card, they have a good guarantee and make a possible claim as simple as possible. Opinions on that are certainly divided, but it is at least my experience.


----------



## opheen

Here is my PC with the EVGA FTW3 3080 and Bykski block: Mods Rigs. With this PC I don't care how it looks inside; it's built for max performance in an ITX case, and the side panels are mesh on both sides.


----------



## masscrazy

opheen said:


> Here is my PC with the EVGA FTW3 3080 and Bykski block: Mods Rigs. With this PC I don't care how it looks inside; it's built for max performance in an ITX case, and the side panels are mesh on both sides.


Nice work. I plan to do a custom loop inside the Meshlicious.

At this moment I have a new EVGA FTW3 3080, but I'm returning it. It cost £900, a cool 150 more than the Ventus I'm considering. Thing is, how much more FPS will I get with a max overclock on the FTW3? Maybe 20 FPS average more. That could be significant, or I could drop one settings notch and make that deficit back up.

So like you said before, the cheapest 3080 under water is the best-value option. I'll wait over the public holiday weekend to see if any deals are to be had.


----------



## opheen

masscrazy said:


> Nice work. I plan to do custom loop inside the meshlicious.
> 
> At this moment I have a new evga ftw 3080 but returning it. It cost £900 a cool 150 more than the ventus which I'm considering. Thing is how much more fps will I get with a max overclock with the ftw, 20fps average more. Could be significant or could reduce one settings notch and get that deficit back up.
> 
> So like you said before, cheapest 3080 under water is best value option. Will wait over public holiday weekend see if any deals to be had.


1440p really shows the difference between a 2x and 3x8-pin card. I would go for the cheapest if you're going water anyway, but if you decide to stay on air, then I would factor the cooler quality into the decision. And if there is not that big a difference between the cheapest card and one with dual BIOS, it may be worth going up a bit in price to get dual BIOS: if you want to flash a vBIOS with a higher TDP, it is much easier to recover if you are unlucky with a bad vBIOS or flash.
The Asus TUF card has dual BIOS. The MSI Gaming Z Trio has 3x8-pin and has dropped a lot in price lately.


----------



## Audioboxer

This BIOS has fixed my 400w power draw wall for my 3080 FTW3 10GB LHR VGA Bios Collection: MSI RTX 3080 10 GB | TechPowerUp

It's 430W, but it works! Power draw reporting in normal apps like GPU-Z/RivaTuner is broken, but I've tested with the HX1000i power-in/power-out reporting and also Metro Exodus, and it now lets me maintain 2100MHz with +1500 on the memory at 1.050V without dropping the core clock. The card is also running hotter, again indicating more power draw: Metro gets the card up to 42 degrees now, where it was previously 37-39 degrees depending on ambient.

So there is hope EVGA could push these cards a bit more with a BIOS fix, but with them ignoring all conversations around this vanilla 3080 PCB, I wouldn't be surprised if everyone ends up stuck using an MSI BIOS on an EVGA card. Not a big issue for most, but having your power draw reporting broken is annoying, especially for anyone who doesn't have a power supply that can digitally report its usage statistics.

And yes, I've tried both the ASUS and Gigabyte 450W LHR BIOSes, one of which I tested months ago. The Gigabyte BIOS just stops EVGA pin 3 from functioning completely, even though the Gigabyte card is a 3-pin card.

*edit* - After further testing, I'm wondering if this BIOS just allows the card to draw near enough whatever it wants lol. This might be a worry if it's essentially just "broken" power balancing.










700W in, 652W out max during Metro. That is quite a bit more than "30W" above my earlier results, where around 400W on the GPU meant around 600W in - [Official] NVIDIA RTX 3080 Owner's Club

My 5950x has a max PPT of 162w and during playing games, usually just hangs around 135w~150w.

The only thing that has changed in my build since that post above is I ditched all my QL120 fans and replaced my whole case with Noctua NF A12-25s. But I actually ended up with 3 less fans than before. So even the power draw from my fans should be _less_ than before, no LEDs now and 3 fans fewer.










Software power reporting on this BIOS is completely broken; my board is not hitting 378W max, lmao, that is simply impossible. I've got Metro running at 2100MHz at 1.050V with +1500 on the memory, and it's staying under the power limit because the frequency is holding. Combine that with my power supply figures and it's clear my card is drawing more than 400W; it's just hard to tell exactly how much more on this 430W MSI BIOS.

It's entirely possible it's drinking down past even 450W and closing in on 480-500W. Though, someone can tell me if that's even possible with a 3080 10GB PCB. I think there is some GALAX 1000W BIOS, but that obviously does not mean an unmodified PCB can actually gobble down 1000W.


----------



## Audioboxer

So, back to Kombustor with this Afterburner "Metro" overclock profile










1280x720 can understandably do a decent job holding 2100mhz at 1.050v. Power In/Out at 593w/636w max.










Bump up to 2560x1440 and the frequency is a bit more bouncy; I've seen it go down to 2055, up to 2085, then back down again as the camera pans. Power In/Out at 662W/617W max spikes.

That's just with a minute or so of running on each, will leave for longer. But again, look at this post [Official] NVIDIA RTX 3080 Owner's Club | Page 249 | Overclock.net










Previously, when I was pinned at 400W on the 450W BIOS, we had Power In/Out at 592W/552W and a frequency of 1995MHz at 1.031V at 1280x720.

This thing is definitely sucking down more power, and I think, doing some crude maths, it's more than 430w at some of those peaks.
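The crude maths can be written out explicitly. The ~652W PSU-out and ~140W CPU figures are from the posts above; the allowance for the rest of the system (board, fans, drives) is a pure guess, and shifting it shifts the estimate one-for-one:

```python
def estimate_gpu_watts(psu_out_w, cpu_w, rest_w=60.0):
    """Crude GPU-draw estimate: DC power the PSU delivers, minus CPU
    package power, minus a guessed rest-of-system allowance (rest_w)."""
    return psu_out_w - cpu_w - rest_w

# HX1000i reported ~652W out during Metro; the 5950X sits around 135-150W
# in games. With a guessed 60W for everything else, the card lands well
# past the nominal 430W of the MSI BIOS.
est = estimate_gpu_watts(652, 140)  # ≈ 452W
```

It's only a sanity check, not a measurement, but it agrees with the temperature jump: the card is clearly drawing more than the broken software readout claims.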


----------



## yzonker

Well, it's most likely limiting on 8pin #2. Most of the BIOSes limit to around 175-180W (or less) on the 8-pins. Looks like a good find. I wouldn't worry about the borked power readings; just enjoy your new-found power limit!

I tried to calc the power usage a couple of ways. 

First was to base it on my own UPS readings for my system with a 3080ti FTW3. I usually see 650-670w at the UPS. That equates to about 600w system usage given my PSU efficiency. But if your estimate of power usage of your 5950x is accurate, then my 5800x3D uses about 70w less while gaming. That puts it at 670w which is close to what you are seeing. So that would put it in in the 420-450w range depending what my card is really pulling (the 3080ti FTW3 has been shown to underreport by 30w or so, but always reports below 450w, yea that may muddy this up slightly).

The other way is to assume the 8pin #2 reading is fairly close (which it may be, my 3080ti with Galax 1kw bios reads reasonably close only on 8pin #2 and shows the same odd 10v on #1 and #2 for voltage).

Use what I'm assuming are your stock BIOS power readings for the 8-pins, then factor up #1 and #3 based on the ratio of #2 (and add in PCIe slot power). The power balance will stay the same between BIOSes, so the only assumption here is that the 8pin #2 reading is reasonably accurate.










And like I said over on EVGA forums, the best way to be sure is to buy a decent amp clamp meter and measure each 8pin manually while running a constant load like Kombustor or pausing Heaven.
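That rail-ratio estimate can be sketched as follows. The stock readings and slot power here are placeholder numbers for illustration, not the values from the screenshot:

```python
def scale_board_power(stock_rails, modded_pin2, stock_slot_w=40.0):
    """Back-of-envelope total board power on the modded BIOS.

    Assumptions (as described above): the 8-pin power *balance* is the
    same on both vBIOSes, and only 8pin #2 reads accurately on the
    modded one. So scale every stock rail reading, plus slot power,
    by (modded #2) / (stock #2).
    """
    ratio = modded_pin2 / stock_rails["8pin_2"]
    scaled = {pin: w * ratio for pin, w in stock_rails.items()}
    return sum(scaled.values()) + stock_slot_w * ratio

# Placeholder stock-BIOS readings in watts.
stock = {"8pin_1": 125.0, "8pin_2": 140.0, "8pin_3": 95.0}
total = scale_board_power(stock, modded_pin2=168.0)  # 480.0W estimated
```

The clamp-meter measurement is still the ground truth; this only turns one trusted rail reading into a whole-board figure.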


----------



## Audioboxer

yzonker said:


> Well it's most likely limiting on 8pin #2. Most of the bios limit to around 175-180w (or less) on the 8pins. Looks like a good find. I wouldn't worry about the borked power readings. Just enjoy your new found power limit!
> 
> I tried to calc the power usage a couple of ways.
> 
> First was to base it on my own UPS readings for my system with a 3080ti FTW3. I usually see 650-670w at the UPS. That equates to about 600w system usage given my PSU efficiency. But if your estimate of the power usage of your 5950x is accurate, then my 5800x3D uses about 70w less while gaming. That puts it at 670w, which is close to what you are seeing. So that would put it in the 420-450w range depending on what my card is really pulling (the 3080ti FTW3 has been shown to underreport by 30w or so, but always reports below 450w, yeah that may muddy this up slightly).
> 
> The other way is to assume the 8pin #2 reading is fairly close (which it may be, my 3080ti with Galax 1kw bios reads reasonably close only on 8pin #2 and shows the same odd 10v on #1 and #2 for voltage).
> 
> Use what I'm assuming are your stock BIOS power readings for the 8-pins, then factor up #1 and #3 based on the ratio of #2 (and add in PCIe slot power). The power balance will stay the same between BIOSes, so the only assumption here is that the 8pin #2 reading is reasonably accurate.
> 
> View attachment 2562755
> 
> 
> And like I said over on EVGA forums, the best way to be sure is to buy a decent amp clamp meter and measure each 8pin manually while running a constant load like Kombustor or pausing Heaven.


Yeah, I responded over there, but I definitely think it's drawing closer to 500w at the highest spikes. It's definitely in line with the temperature increase I'm seeing on both the core and hot spot.

I'm seeing closer to +5/6 degrees on the core and potentially +12-15 degrees on the hotspot. Given my 4 rads and cooling, this is going from 37-39 degrees on the core to 42-44 degrees. Hotspot is something like 50s to a max I've seen of 67 degrees in HWiNFO.

A 30w increase in power given my cooling potential is not causing those jumps IMO.

Still wondering about the other questions I raised on the EVGA forums, such as why does only the MSI BIOS do this and both the ASUS and Gigabyte seem to result in marginally less performance. 

But yeah, I'll just enjoy this power free-for-all on my card and sit patiently, hoping someone else with the same card as me tries it out as well. Chances are there'll be zero response from EVGA, meaning going forward this will be the only unofficial way to get more power.


----------



## Audioboxer

I guess I'll need to reinstall 3DMark and have a poke around at the top end of the voltage scale. Previously I had little chance at this, as trying to maintain 1.1v at any reasonable frequency almost always ended in hitting the power limit very quickly and clocks dropping.

Ignore the temps above; the game had only been running a few minutes, and it takes longer for my loop to hit equilibrium due to rad space/volume of water.

More interesting, though, is maintaining 2.2GHz at 1.1v.

Personally I'm happy with 2.1GHz at 1.050v, and it's Metro Exodus stable. This seems to be the power-hungry sweet spot for performance/temps. My 2080Ti needed 1.093v to manage 2.1GHz. Also, in Metro, 2.1GHz at 1.05v even manages to kiss the power limit (which I presume is now anywhere from 480w~500w), so something like 2.2GHz at 1.1v is never going to hold consistently in a game like that with RT on.

Running a 3080 at 1.1v is, on the face of it, likely just about bench bragging rights, as I doubt the thermal increase will be worth it for a frequency around 2.17~2.2GHz, depending on the game's power draw.

Still, further evidence of the power cap being _unrestricted_ now.

edit -










Yup, 5 mins later in a busier scene and temps normalising where I'd expect, low 40s, and frequency comes down a bit due to thermals.










Though it is still happy to bounce back up if the scene is less demanding/thermals go down again.

On a related note, press F for electricity costs with these rumoured new Nvidia cards and what their power draw could be. If I had a 3080Ti I'd probably be less intrigued to push it like this; the 3080, whilst a decent card, still hurts a bit with RT. That's really the hog. Chasing really high core clocks seems, to me, less important than a card having the hardware to handle RT comfortably above 60FPS, with the goal of approaching 100+ FPS.

A vanilla 3080 still seems like a bit of a "beta test" for functional RT; it's really going to be outdated fast. It pretty much is beginning to be if you have a high refresh rate monitor. Not that I'm complaining though, it's still a good card and will be for years to come if you adjust expectations accordingly.










Hotspot tracks a bit higher than it did on the EVGA BIOS, but I would think that's just down to the higher power draw. I mean, I used to run the EVGA BIOS to a max voltage of 0.987v, now it's 1.1v lol.


----------



## yzonker

Audioboxer said:


> Yeah, I responded over there, but I definitely think it's drawing closer to 500w at the highest spikes. It definitely goes in line with the temperature increase I'm seeing on both the core and hot spot.
> 
> I'm seeing closer to +5/6 degrees on the core and potentially +12-15 degrees on the hotspot. Given my 4 rads and cooling, this is going from 37-39 degrees on the core to 42-44 degrees. Hotspot is something like 50s to a max I've seen of 67 degrees in HWiNFO.
> 
> A 30w increase in power given my cooling potential is not causing those jumps IMO.
> 
> Still wondering about the other questions I raised on the EVGA forums, such as why does only the MSI BIOS do this and both the ASUS and Gigabyte seem to result in marginally less performance.
> 
> But yeah, I'll just enjoy this power free for all on my card and sit patient hoping someone else with the same card as me tries it out as well. Chances are zero response from EVGA meaning going forward this will be the only unofficial way to get more power.


Yea, I don't see a way to determine exactly what is wrong with the EVGA BIOS. It's hitting a predefined limit. The screenshot below is from using ABE to view the older 2206 BIOS for each card (ABE can't read the 2216, unfortunately). I don't see any difference in limits that would explain why the EVGA BIOS limits early (or why the MSI works). The only one that jumps out at me is the VRAM limit of 80w vs 100w on the MSI, but your stock BIOS screenshot only shows the EVGA BIOS at 51w (MVDDC).

Although the ASUS bios definitely has the "AUX" limits set a lot lower. Not sure what those are, but that might explain why it does not work while the MSI does.

You can also clearly see the 175w 8pin limit on them all though.

EVGA I'm sure could tweak the bios to work, but they may be concerned with card reliability given all the failures they've had.


----------



## Audioboxer

yzonker said:


> Yea I don't see a way to determine exactly what is wrong with the EVGA bios. It's hitting a predefined limit. The screenshot below is using ABE to view the older 2206 bios for each card (ABE can't read the 2216 unfortunately). I don't see any difference in limits that would explain why the EVGA bios limits early (or why MSI works). The only one that jumps out at me is the VRAM limit of 80w vs 100w on the MSI, but your stock bios screenshot only shows the EVGA bios at 51w (MVDDC).
> 
> Although the ASUS bios definitely has the "AUX" limits set a lot lower. Not sure what those are, but that might explain why it does not work while the MSI does.
> 
> You can also clearly see the 175w 8pin limit on them all though.
> 
> EVGA I'm sure could tweak the bios to work, but they may be concerned with card reliability given all the failures they've had.
> 
> View attachment 2562812


Interesting, thanks!

I guess the question for EVGA then becomes: why release a 450w BIOS for a card you are worried about? And also, why do some of their vanilla 3080s report going up to 450w while others like mine struggle to go above 400w? The lack of any sort of communication is just weird.

Gigabyte also doesn't work; it was one of the first I tried (VGA Bios Collection: Gigabyte RTX 3080 10 GB | TechPowerUp). It behaves the same as the ASUS.


----------



## Audioboxer

Ambient is getting higher in the UK as summer kicks off (and my study/office is modest-sized, so my case dumping out 700w acts as a nice heater lol), but this is impressive, never had my GPU anywhere near this temp on water before.

More importantly, I caught a spike to 708w power in, and this is with Kombustor "only" at 1280x720.

I truly think the card is now pulling up to, and possibly even exceeding, 500w in spikes. With the CPU around 125w in Kombustor, nothing else can explain the discrepancy. Even if the CPU were underreporting a bit, I have it on PBO defaults, so 142w PPT. So I'd still say we have to be looking at around 500w spikes.
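For what it's worth, that wall-reading argument can be put into a quick back-of-envelope calc. The PSU efficiency and "rest of system" figures below are assumptions, not measurements.

```python
# Sanity check of the wall-reading argument above. PSU efficiency (~90% for
# a decent unit at this load) and the misc system draw are assumed values.

def dc_load_from_wall(wall_w, efficiency=0.90):
    """Convert AC power at the wall to DC load inside the system."""
    return wall_w * efficiency

wall_spike = 708.0       # W "power in" caught at the meter
cpu_w = 125.0            # reported CPU package power in Kombustor
other_system_w = 60.0    # assumed: fans, pump, drives, board, RAM

gpu_estimate = dc_load_from_wall(wall_spike) - cpu_w - other_system_w
print(f"Implied GPU draw: ~{gpu_estimate:.0f} W")
```

Depending on how generous you are with those assumptions, the implied GPU draw lands in the 450-500w region, which matches the spikes I'm describing.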


----------



## acoustic

That original FTW3 3080 PCB 'bout to go "bye-bye" lol


----------



## Audioboxer

acoustic said:


> That original FTW3 3080 PCB 'bout to go "bye-bye" lol


lol, they're that bad?

I only got the card a few months ago and it's an LHR model, obviously, so you'd have thought if there were design issues at launch in 2020 they'd have done something by now!

I'll likely daily 2100/1.050v, so that if it explodes it's RMA time. Anything above 2100MHz seems to require an exponential increase in voltage. Even 2115 at 1.062v struggles. So we're likely talking about jumping up to 1.093~1.1v for frequencies above 2100MHz to be happy in all scenarios.


----------



## yzonker

Audioboxer said:


> lol, they're that bad?
> 
> I only got the card a few months ago and it's an LHR model, obviously, so you'd have thought if there were design issues at launch in 2020 they'd have done something by now!
> 
> I'll likely daily 2100/1.050v, so that if it explodes it's RMA time. Anything above 2100MHz seems to require an exponential increase in voltage. Even 2115 at 1.062v struggles. So we're likely talking about jumping up to 1.093~1.1v for frequencies above 2100MHz to be happy in all scenarios.


The only issue is that bios would still be on the card. I'm not sure if EVGA checks for that or not.


----------



## Audioboxer

yzonker said:


> The only issue is that bios would still be on the card. I'm not sure if EVGA checks for that or not.


When my 2080Ti died (a cheap 2 pin blower model because I just wanted the card for watercooling) it went back with an EVGA FTW3 bios on it lol. But I guess that was still an EVGA bios!

EVGA RMA is usually really good. Plus, you just raise a ticket saying "I was playing New World and my PC shut down". Instant "No problem, let's RMA that card!"


----------



## acoustic

Audioboxer said:


> lol, they're that bad?
> 
> I only got the card a few months ago and it's an LHR model, obviously, so you'd have thought if there were design issues at launch in 2020 they'd have done something by now!
> 
> I'll likely daily 2100/1.050v, so that if it explodes it's RMA time. Anything above 2100MHz seems to require an exponential increase in voltage. Even 2115 at 1.062v struggles. So we're likely talking about jumping up to 1.093~1.1v for frequencies above 2100MHz to be happy in all scenarios.


My original 3080 FTW3 one day decided to hit the "red lights of death" and never power on again. That card would hit 480w on software reporting.. never had much issue with power balancing. Must be why it died LOL


----------



## Audioboxer

acoustic said:


> My original 3080 FTW3 one day decided to hit the "red lights of death" and never power on again. That card would hit 480w on software reporting.. never had much issue with power balancing. Must be why it died LOL


lol, EVGA really did disappoint with calling the original PCBs a "FTW3 model".

I'll see how I get on. I've locked it at 2100/1.050v, so more likely power draw just goes to ~450w in the heavy-load games, rather than what appeared to be spiking to 500w+.


----------



## sisay

I have a Palit 3080 12GB. The GPU temperature reaches 83 C (even though the voltage goes down to 780 mV), and during the FurMark test the clock speed does not exceed 1000MHz (why? I see people getting 1600-1700MHz during the test). In games and other benchmarks it's at the lower end for a 3080, but it's OK (except for the temperature).
Is this normal?


----------



## opheen

Audioboxer said:


> View attachment 2562815
> 
> 
> View attachment 2562816
> 
> 
> Ambient is getting higher in the UK as the summer kicks off (and my study/office is modest sized, so my case dumping out 700w acts as a nice heater lol), but this is impressive, never had my GPU anywhere near this temp on water before.
> 
> More importantly, a spike caught to 708w power in, and this is with Kombustor "only" at 1280x720.
> 
> I truly think the card is now pulling up to and possibly even exceeding 500w in spikes. With the CPU around 125w in Kombustor, nothing else can explain the discrepancy. Even if the CPU were underreporting a bit, I have it on PBO defaults. So 142 PPT. So I'd still say we have to be looking at around 500w spikes.



EVGA GeForce RTX 3080 FTW3 XOC BIOS - EVGA Forums: the 450w BIOS for your card is here, maybe you have already tried it? My card is able to draw 447w. You have to have big headroom on the PSU, and use one cable per 8-pin from the PSU even if it looks like dogshit. Many go with two cables, because a single PSU cable has 2x 6+2-pin connectors on it, but when you really try to push it you will see the difference between two and three cables from the PSU. It will not crash with only two, but it is more unstable. I am talking about 3x8pin cards at 420w+++ here, so no one get me wrong.


----------



## nevartojau

I am rocking an MSI Sea Hawk X RTX 3080. It's an LHR 10GB card, hybrid cooled (water and air), and it has 2x8pin connectors. I bought it because it was the only GPU available in Norway at the time I was looking for one.

Does anyone have any input on an OC approach? It's power limited, so I guess the possibilities are very limited, and I haven't seen anyone experimenting with a custom BIOS on it.


----------



## opheen

nevartojau said:


> I am rocking an MSI Sea Hawk X RTX 3080. It's an LHR 10GB card, hybrid cooled (water and air), and it has 2x8pin connectors. I bought it because it was the only GPU available in Norway at the time I was looking for one.
> 
> Anyone has any input on a OC approach? It's power limited, so I guess the possibilities are very limited and I haven't seen anyone experimenting with custom BIOS on it.


What is the stock TDP at? And is there dual BIOS on the card? 375w is the max on a 2x8pin 3080 without shunt resistors.


----------



## nevartojau

opheen said:


> What is the stock TDP att ? and is it DualBios on the card ?  375w is max on a 2x8pin 3080 Without Shunt resistors.



It says 320W is the max, and it's a single-BIOS card, which makes it risky to frick around. Fun times.


----------



## opheen

nevartojau said:


> It says 320W is the max and it's single BIOS card which makes it risky to frick around. Fun times.


And the slider is at 100% and stops there? Yeah, you need to find a card with the same port layout, like 3xDP and 1x HDMI 2.1. If you have an Intel K CPU or an AMD G with integrated graphics, it's not that hard to recover from a bad flash. I know the ASUS TUF 3080 OC has a 375w max TDP, and that's a 2x8pin card. I have two 3080s; one is a 2x8pin, 370w stock TDP, 10GB LHR AORUS Waterforce WB. My card has 3xDP and 3xHDMI, so it could be funny with your ports.


----------



## nevartojau

opheen said:


> And the slider is att 100% and stops there ?. Yeah you need to find a card with the same portlayout like 3xDP and 1xHDMI2.1 if you have intel K cpu or amd G with grapichs on its not that hard to recover from a bad flash. i know that asus TUF 3080 oc have 375w Max TDP thats a 2x8pin card, i have 2 3080's one card with 2x8pin 370w tdp stock 10gb LHR AORUS Waterforce WB. My card have 3xDP and 3xHDMI so it could be funny with your ports.



Yeah, the current BIOS has zero power adjustment; the slider is at 100% and that is the max of 320W. I feel like this card could kick harder with a little bit of extra juice. It stays pretty cool and junction temps are max 80 C under any load. In gaming it's usually 72 C on the junction.


I don't have a spare GPU or integrated graphics. That's the catch. I don't mind shorting some pins if I brick the card, but then I'm left with no display.

But I have this urge to try. It's a pity no one has tried anything with this card. Can't find **** on the internet.

I need to get a cheap spare GPU, something from the GT series, for ****ery like this.


----------



## opheen

nevartojau said:


> Yeah, the current BIOS has zero power adjustment, the slider is at 100% and that is the max of 320W. I feel like this card could kick harder with a little bit of extra juice. It stays pretty cool and junction temps are max 80 C under any load. In gaming it's usually 72 C on the junction.
> 
> 
> I don't have a spare GPU or integrated graphics. That's the case. I don't mind shorting some pins if I brick the card, but then I'm left with no display
> 
> But I have this urge to try. It's a pity no one has tried anything with this card. Can't find **** on the internet.
> 
> I need to get a spare cheap GPU. Something from GT series. For ****ery like this.


72C with water cooling is more than I expected; not that 72C is high, it's better than most cards. Mine is like 55C max. This BIOS could maybe work on your card: VGA Bios Collection: Asus RTX 3080 10 GB | TechPowerUp, 366w max TDP.


----------



## opheen

opheen said:


> 72c with water cooling is more than i expected, not that 72c is high, its better than most cards. mine is like 55c max . this bios could work on your card mby ASUS RTX 3080 VGA Bios Collection: Asus RTX 3080 10 GB | TechPowerUp 366w MAX TDP.


I changed it to the right link now.


----------



## nevartojau

opheen said:


> 72c with water cooling is more than i expected, not that 72c is high, its better than most cards. mine is like 55c max . this bios could work on your card mby ASUS RTX 3080 EKWB Specs | TechPowerUp GPU Database 366w MAX TDP.



The GPU stays at 60-ish while gaming, but the memory junction goes to 72-80 C. I was talking about the memory junction. I also use stock curves on the pump and fans, so the radiator fans don't even spin below 60 C on the GPU.


----------



## opheen

nevartojau said:


> The GPU stays at 60 ish while gaming, but the memory junction goes to 72 - 80 C. I was talking about the memory junction. I also Use stock curves on pump and fan, so radiator fans don't even spin below 60 C on the GPU.


With my card I have to go down to 0.944v and 2010MHz, +850 mem, to have stable clocks; if I go any higher on voltage it runs out of power. With 320w you want to go down to 0.895v-0.944v and maybe 1850-1980MHz on the core. I use Time Spy when I undervolt my cards.


----------



## opheen

nevartojau said:


> The GPU stays at 60 ish while gaming, but the memory junction goes to 72 - 80 C. I was talking about the memory junction. I also Use stock curves on pump and fan, so radiator fans don't even spin below 60 C on the GPU.


This is how I do it, and I hope this will help you. I'm not saying this is the best way, but it works.


----------



## opheen

nevartojau said:


> Yeah, the current BIOS has zero power adjustment, the slider is at 100% and that is the max of 320W. I feel like this card could kick harder with a little bit of extra juice. It stays pretty cool and junction temps are max 80 C under any load. In gaming it's usually 72 C on the junction.
> 
> 
> I don't have a spare GPU or integrated graphics. That's the case. I don't mind shorting some pins if I brick the card, but then I'm left with no display
> 
> But I have this urge to try. It's a pity no one has tried anything with this card. Can't find **** on the internet.
> 
> I need to get a spare cheap GPU. Something from GT series. For ****ery like this.


That EK WB 3080 vBIOS won't work, since it is FHR and not LHR. You have to use a vBIOS from an LHR card to make it work.


----------



## Shadowzero_BR

Hi guys! I have an RTX 3080 Galax SG LHR 10GB, and the 320w TDP is an issue when overclocking. I searched for a custom BIOS on techpowerup and only the stock BIOS for the LHR model is available there. My question is: can I download the BIOS of a similar Galax 10GB model (e.g. the Galax Gamer White Ver., with 340w/360w TDP) and apply it to mine? Is there a risk of bricking my card? Sorry for my bad English, it's not my first language.
If it's not possible to flash the BIOS, are there any tips for performance gains?
I tried core voltage, clock.. but no gains, because the TDP is the cap.
I will try to keep temperatures as low as possible to keep the frequency boost. (Any tips here?)
Thanks !


----------



## fray_bentos

Shadowzero_BR said:


> Hi guys ! I have one RTX 3080 Galax SG LHR 10Gb and the 320w TDP is an issue when overclocking. I searched a custom bios on techpowerup and only the stock bios for LHR model is avaliable there. My question is : Can I download a similar Galax 10gb model Bios (ex : Galax Gamer White Ver., with 340w/360w TDP) and apply on mine ? There's a risk of brick my VGA ? Sorry for bad english, it's not my first lang.
> If it´s not possible to flash the bios, there's any tip for performance gains ?
> I tried core voltage, clock.. but no gains, because the TDP is the cap.
> I will try to keep temperatures as low as possible to keep the frequency boost.(any tip here ?)
> Thanks !


Undervolt.


----------



## Shadowzero_BR

fray_bentos said:


> Undervolt.


Thanks for the response. Do you know where I can find information about the default voltage? In MSI Afterburner the core voltage slider is disabled and shows the minimum adjustment possible: 700mV. But when I enable the bar and adjust to 850mV or 900mV, the card underperforms, even with cold temps. I'm a bit confused by this setting.


----------



## blurp

A simple Google search will show plenty of tutorials. I used this one:






Rtx 3000 series undervolt discussion


Let's talk undervolting. Creating a custom voltage curve should both reduce power usage and possibly increase performance (if it reduces Temps enough to get you into higher boost bins). Here is the guide I have created, thanks to everyone who gave feedback and helped get this tested and revised...



hardforum.com





I’m @ 1875 887 mv. Max watt around 300-305W. 50-55C. Low noise.
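For anyone following along, the curve reshaping those undervolt guides describe boils down to: pick a target voltage/frequency point, then flatten every curve point above that voltage down to the target clock. A toy model of that reshaping (the curve points below are illustrative, not from a real card):

```python
# Toy model of an Afterburner-style undervolt: clamp every V/F point at or
# above the target voltage to the target frequency, so the card never
# requests more than target_mv under load. Values are illustrative only.

def flatten_curve(curve, target_mv, target_mhz):
    """curve: list of (millivolts, mhz) points, ascending by voltage."""
    flattened = []
    for mv, mhz in curve:
        if mv < target_mv:
            flattened.append((mv, mhz))          # leave lower points alone
        else:
            flattened.append((mv, target_mhz))   # clamp everything above
    return flattened

stock = [(700, 1440), (800, 1650), (887, 1800), (1000, 1950), (1093, 2010)]
uv = flatten_curve(stock, target_mv=887, target_mhz=1875)
print(uv)  # points at/above 887 mV now all run at 1875 MHz
```

In Afterburner you do the same thing by hand: offset the curve up so your target voltage reaches your target clock, then drag everything to the right of it flat.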


----------



## Shadowzero_BR

blurp said:


> A simple Google search will show plenty of tutorials. I used this one:
> 
> 
> 
> 
> 
> 
> Rtx 3000 series undervolt discussion
> 
> 
> Let's talk undervolting. Creating a custom voltage curve should both reduce power usage and possibly increase performance (if it reduces Temps enough to get you into higher boost bins). Here is the guide I have created, thanks to everyone who gave feedback and helped get this tested and revised...
> 
> 
> 
> hardforum.com
> 
> 
> 
> 
> 
> I’m @ 1875 887 mv. Max watt around 300-305W. 50-55C. Low noise.


Thanks ! I'm on my job right now. I will try later tonight.


----------



## Shadowzero_BR

When flashing a BIOS onto an LHR card, does it have to be an LHR BIOS? Because there are almost no custom LHR BIOSes on techpowerup.


----------



## yzonker

Shadowzero_BR said:


> Flashing bios on a LHR VGA must be LHR bios ? Because there's almost nothing LHR custom bios on techpowerup.


Yes. The device ID has to be the same.


----------



## Shadowzero_BR

yzonker said:


> Yes. The device ID has to be the same.


Do you know why the lack of custom LHR bios ? Protection ?


----------



## yzonker

Shadowzero_BR said:


> Do you know why the lack of custom LHR bios ? Protection ?


Probably just haven't been uploaded by anyone. Did you check the unverified list?


----------



## Shadowzero_BR

yzonker said:


> Probably just haven't been uploaded by anyone. Did you check the unverified list?


The unverified list doesn't specify the exact model; you must check every ID to confirm LHR. But in the verified list we rarely find LHR; most LHR entries are the default BIOS for restore purposes.
Strange.


----------



## Audioboxer

A poster on the EVGA forums measured the actual per-pin power draw with that MSI Suprim BIOS on a 12GB model:



> After reading your posts about the MSI bios I decided to try it on my 3080 12GB FTW3 card. With the normal bios it will draw 450W under heavy load such as furmark but under normal loads only around 420W. With the MSI Suprim X bios I measured the current on each 8 pin supply with a clamp meter while running furmark and found that it was drawing 163W, 225W, and 83W from input 1,2, and 3 and software reported 63W from the PCIe slot for a total of 534W! As a comparison with the normal EVGA bios I read 135W, 188W, 84W from inputs 1,2,3 and 47W from PCIe slot for a total of 454W. The card will also draw more power under normal use with the MSI bios pulling around 480W instead of 420 in benchmarks.


Was literally getting up to 534w lol. 225W on pin #2.

So I guess that explains why this MSI bios pretty much lets these EVGA 3080 cards do whatever they want when it comes to power draw lol.


----------



## carneb

Audioboxer said:


> Poster on the EVGA forums measured the actual pin power draw with that MSI Suprim bios on a 12GB model and
> 
> 
> 
> Was literally getting up to 534w lol. 225W on pin #2.
> 
> So I guess that explains why this MSI bios pretty much lets these EVGA 3080 cards do whatever they want when it comes to power draw lol.


That was me who posted on the EVGA forum. The Suprim X bios increases the power but there is still a limit somewhere. Under lighter loads the card will run at 1.1V all the time but with heavy loads (benchmarks, not furmark) it will still drop the voltage due to power limiting.

It will let the card draw around 510W in 3DMark and let me increase my Port Royal score to 13851.😁


----------



## Imprezzion

Maybe I should just commit and shunt my Gigabyte Gaming OC 3080, but I'm quite scared to do so due to Gigabyte using those weird flat 8-pin cable adapters. I don't use the Gigabyte conversion block but rather ModDIY custom cables going direct from normal PCI-E to the flat connector, and the card has a Bykski full-cover block. Still, I have no idea how much power those cables and connectors can somewhat safely pull.

I'm using a Z590 Maximus XIII Hero which should handle a bit more wattage out of PCI-E just fine but..


----------



## Audioboxer

Imprezzion said:


> Maybe I should just commit and shunt my Gigabyte Gaming OC 3080 but I'm quite scared to do so due to Gigabyte using those weird flat 8 pin cable adapters. I do not use the Gigabyte conversion block but rather ModDYI custom cables direct from normal PCI-E to the flat connector and the card has a Bykski full cover block but still, I have no idea how much power those cables and connectors can somewhat safely pull.
> 
> I'm using a Z590 Maximus XIII Hero which should handle a bit more wattage out of PCI-E just fine but..


Try installing an MSI BIOS; seems they like throwing 500w+ at other manufacturers' cards.


----------



## Imprezzion

Audioboxer said:


> Try installing an MSI BIOS, seems they like throwing 500w+ at cards of other manufacturers


Well.. it has dual BIOS, and my other BIOS slot at the moment has an EVGA XC3 BIOS on it, which gives it a little more power but somehow breaks the VRAM, so it only detects and uses 8GB instead of 10 lol.. I can flash an MSI 2x8pin BIOS from a Ventus or whatever MSI card has 2x8pin, just to see what happens 😂


----------



## mouacyk

Imprezzion said:


> Maybe I should just commit and shunt my Gigabyte Gaming OC 3080 but I'm quite scared to do so due to Gigabyte using those weird flat 8 pin cable adapters. I do not use the Gigabyte conversion block but rather ModDYI custom cables direct from normal PCI-E to the flat connector and the card has a Bykski full cover block but still, I have no idea how much power those cables and connectors can somewhat safely pull.
> 
> I'm using a Z590 Maximus XIII Hero which should handle a bit more wattage out of PCI-E just fine but..


I've got the exact same card and block, and made my own 18-gauge PCIe cables, very short for my case. Been using them since Feb at up to 450W, no issues.


----------



## Kaltenbrunner

Where do you guys think the 3080 sits vs the 6900xt and 3080ti these days? The 3080 is just a touch cheaper than the 6900xt here, and the 3080ti is about $150US more.

So the 3080ti has the really big NV tax, I should avoid it I guess, at least new.

Pros/cons of 3080 vs 6900xt??? I know the next gen is next fall/winter, but then I'd be fine till 2024 @1440p.

The benches are all over the place. HWUnboxed has a 3-month-old review of the 3080 vs 6900xt, with the 3080 winning by 2% across 50 games at 1440p; AMD does better in DX12.

But then in the 1-year-old review of the 3080ti, the 6900xt beats it and even the 3090 a few times, so I'm confused. I'm not into DLSS or ray tracing, so the 6900xt is looking good again.








ok then to keep going


----------



## yzonker

The 6900xt is very competitive with the 3080 except in RT; it does fall down pretty badly there. If you like to OC, the AMD cards also have the advantage of BIOS modding through MorePowerTool. I think Nvidia drivers are still a little better, although I don't have both to compare, just going on comments I've seen in various places.


----------



## Panchovix

Kaltenbrunner said:


> Where do u guys think the 3080 is vs the 6900xt and 3080ti these days ? The 3080 is just a touch cheaper than the 6900xt here, and the 3080ti is about $150US more.
> 
> So the 3080ti has the really big NV tax, I should avoid it I guess, at least new.
> 
> pro's con's of 3080 vs 6900xt ??? I know the next gen is next fall/winter, but then I'd be fine till 2024 @1440p
> 
> The benches are all over the place, HWUnboxed has 3 month old a review of 3080 vs 6900xt, w/ the 3080 winning by 2% in 50 1440p games, AMD does better in DX12
> 
> But then the 1year old review of 3080ti, the 6900xt beats it and even the 3090 a few times, so I'm confused. I'm not into DLSS or ray tracing, so the 6900xt is looking good again.
> 
> 
> 
> 
> 
> 
> 
> 
> ok then to keep going


At the same price, IMO I would go for the 6900XT, since I also really hate how little overclocking headroom you have on NVIDIA's Ampere cards vs Turing or Pascal, for example.
Though if you want ray tracing, the NVIDIA card will do better. Honestly, in my case I never use it, as it's not worth the performance hit (with the exception of Control).


----------



## andsoitgoes

Hi everyone - I've got an EVGA 3080 XC3 Ultra Hydro Copper card and I'm trying to push this a little bit. I've never played around with overclocking, but I figure I've put all this effort into a WC system, might as well make it scream a little bit since I have the room.

Running the Time Spy demo after scanning the OC settings barely pushed it past 40, but I'm really not sure how to match clock frequency with other people who have similar setups (same CPU/GPU/type of motherboard)

For example, here's a comparison with mine on the left:

Clock frequency: 2,010 MHz (1,440 MHz) vs 2,115 MHz (1,440 MHz)
Average clock frequency: 1,929 MHz vs 2,006 MHz
Memory clock frequency: 1,213 MHz (1,188 MHz) vs 1,332 MHz (1,188 MHz)
Average memory clock frequency: 1,204 MHz vs 1,322 MHz

I know a lot of this comes down to the silicon lottery, however I'd love to at least try to reach the same numbers and see what happens.

What I've done at this point is run the OC Calibration in MSI Afterburner and gone with that, so my core clock is on a curve, and memory clock is at +1350

Thanks for the help.


----------



## zebra_hun

New Bench result, ACV:
2190MHz/1.04V
2220MHz/1.06V


----------



## BenchAndGames

I have a question for you guys. I just realized that my RTX 3080 TUF is drawing more than 150W per 8-pin connector; I don't know if this can be dangerous or not.

The second thing is I thought the 12v rail had to be at 12v max, not the 12.26v that I see at some points.

Is it normal to be like this? I remember with my old RTX 3070 the max I saw was 12v, not much more.

True is with 3070 I had different PSU (Corsair RM750x V2)
But now with the 3080 I have (Corsair RM850x 2021)

So is everything normal or something bad happens ?










Thanks


----------



## Imprezzion

BenchAndGames said:


> I have a question for you guys, I just realized that my RTX 3080 TUF has doing more than 150W per pin connector, dont know if this can be dangerous or not.
> 
> The second thing is I tought 12v rail it has to be max at 12v and not 12.26v that I have on some points.
> 
> Is this normal to be like this, I remember with my old RTX 3070 max that I saw was 12v but not much more.
> 
> True is with 3070 I had different PSU (Corsair RM750x V2)
> But now with the 3080 I have (Corsair RM850x 2021)
> 
> So is everything normal or something bad happens ?
> 
> View attachment 2567583
> 
> 
> Thanks


Totally normal and well within limits.
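For a rough sanity check on why 150W+ through one 8-pin isn't alarming: the PCIe spec budgets 150W per 8-pin connector, but the connector's pins can physically carry considerably more. A back-of-the-envelope sketch (the per-pin rating below is an assumed nominal figure, not from this thread):

```python
# Rough headroom check for an 8-pin PCIe power connector.
# Assumptions (not from the thread): Mini-Fit Jr style pins rated
# around 8 A each, and an 8-pin connector carrying 12 V on 3 pins.
PIN_CURRENT_A = 8.0    # conservative per-pin rating (HCS pins are rated higher)
PINS_12V = 3           # live 12 V pins in an 8-pin PCIe connector
SPEC_LIMIT_W = 150.0   # what the PCIe spec budgets per 8-pin

def connector_headroom(measured_w: float, rail_v: float = 12.0) -> dict:
    """Compare a measured connector draw against spec and physical limits."""
    physical_limit_w = PIN_CURRENT_A * PINS_12V * rail_v  # 288 W at 12 V
    return {
        "over_spec": measured_w > SPEC_LIMIT_W,
        "physical_limit_w": physical_limit_w,
        "margin_w": physical_limit_w - measured_w,
    }

print(connector_headroom(165.0))
```

So a reading of 165W is over the paper spec but still well under what the hardware can actually deliver, which is why cards like the Strix can ship 450W BIOSes on two or three of these connectors.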


----------



## chinyonghui

Hi all,

I have a ROG RTX 3080 and am new to overclocking GPU and would be grateful if anyone can provide a guide or steps for me to follow.

Also, should I be using ASUS GPU Tweak III or MSI Afterburner?

And what are the recommended tests to ensure stability? (e.g. Port Royal, Fire Strike, Time Spy)

Any help or advice is much appreciated!


Thanks in advance!


----------



## opheen

chinyonghui said:


> Hi all,
> 
> I have a ROG RTX 3080 and am new to overclocking GPU and would be grateful if anyone can provide a guide or steps for me to follow.
> 
> Also, should I be using ASUS GPU Tweak III or MSI Afterburner?
> 
> And is what are the recommended tests to ensure stability? (eg. Port Royal, Fire Strike, Time Spy)
> 
> Any help or advice is much appreciated!
> 
> 
> Thanks in advance!


Hey! I think MSI Afterburner is the best tool to use :-) Port Royal, Fire Strike, Time Spy, and OCCT are all pretty good, plus the games you play, to ensure it's stable. I use Call of Duty Warzone "Caldera" to make sure my tune is rock solid.


----------



## stahlhart

chinyonghui said:


> Hi all,
> 
> I have a ROG RTX 3080 and am new to overclocking GPU and would be grateful if anyone can provide a guide or steps for me to follow.
> 
> Also, should I be using ASUS GPU Tweak III or MSI Afterburner?
> 
> And is what are the recommended tests to ensure stability? (eg. Port Royal, Fire Strike, Time Spy)
> 
> Any help or advice is much appreciated!
> 
> 
> Thanks in advance!


You're on the right track so far -- what @opheen said, also look at undervolting as an option once you've spent some time familiarizing yourself with the card.


----------



## Imprezzion

Why is my card so weird? I ran a 1995 MHz @ 0.993V / 1980 MHz @ 0.987V undervolt curve, which without ray tracing is fine power-wise at 1080p, but with 2x resolution scale in some games, or ray tracing in for example Shadow of the Tomb Raider or Cyberpunk, it would still hit the limiter. I cannot go even 1 bin higher or it crashes within minutes in games. VRAM is at +1200 and is not error correcting. It's under a full cover block; the core sits in the 46°C range, hotspot around 62°C, VRAM 70-72°C (bad stock Bykski pads).

So I wanted to build a new undervolt curve at a lower voltage, and found that at 0.906V it would sit right at 315-320W in, for example, Cyberpunk 2077 with ray tracing maxed (Psycho, no DLSS, 1080p), so that's the voltage I'm going for. I steadily raised the curve clocks at 0.906V to see how high it would still clock, expecting something like the mid-1800s. To my surprise I've been playing Cyberpunk for about half an hour at 1950 core (~1928 effective clock) and it's been fine; it hasn't crashed or done anything weird yet. So why can I run 1950 at 0.906V fine but can't run 2010 even at 0.993V? That scaling makes zero sense to me.


----------



## chinyonghui

Thanks for your replies @stahlhart @opheen !!

Oh by the way, what software do you guys recommend to check on temps and other data regarding the GPU?

I have HWInfo64 and AIDA64 Extreme.

Or is there no difference between the two?

Thanks in advance!!


----------



## stahlhart

chinyonghui said:


> Thanks for your replies @stahlhart @opheen !!
> 
> Oh by the way, what software do you guys recommend to check on temps and other data regarding the GPU?
> 
> I have HWInfo64 and AIDA64 Extreme.
> 
> Or is there no difference between the two?
> 
> Thanks in advance!!


HWiNFO64 pretty much has you completely covered, should be able to read any and all sensors in your build.
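If you'd rather log sensors from a script than a GUI, `nvidia-smi` can dump most of the same readings as CSV. A minimal parsing sketch (the query fields below are a typical choice, not something this thread prescribes):

```python
import csv, io

# Example query (run separately, once per second):
#   nvidia-smi --query-gpu=temperature.gpu,power.draw,clocks.gr \
#              --format=csv,noheader,nounits -l 1
SAMPLE = "48, 324.57, 1950\n"  # one line of typical output for that query

def parse_sample(line: str) -> dict:
    """Parse one CSV line from the query above into numeric fields."""
    temp_c, power_w, core_mhz = next(csv.reader(io.StringIO(line)))
    return {
        "temp_c": int(temp_c.strip()),
        "power_w": float(power_w.strip()),
        "core_mhz": int(core_mhz.strip()),
    }

print(parse_sample(SAMPLE))
```

Handy if you want long-duration logs to compare undervolt profiles, since you can append each parsed line to a file and graph it later.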


----------



## vortex240

Does anyone know if there is a higher power bios for a Gigabyte 3080 10GB gaming OC. Stock is 370watts.

Is there another one that can be cross flashed from another brand perhaps?

I've been looking around but haven't come across one.


----------



## Imprezzion

vortex240 said:


> Does anyone know if there is a higher power bios for a Gigabyte 3080 10GB gaming OC. Stock is 370watts.
> 
> Is there another one that can be cross flashed from another brand perhaps?
> 
> I've been looking around but haven't come across one.


I've been looking for one since the card released: no, there is none. I tried every BIOS from every brand in existence with 2x8-pin; the best is still the latest Gigabyte original one with ReBAR support (the OC mode one, not quiet mode). Undervolt it to pull at most 0.956V at 1080p or 0.906V at 1440p/4K (power draw rises at higher render resolutions) and you can run anything, including ray tracing, without power limiting and with a max VRAM OC.

I run mine at 1965 @ 0.906V with +1400 VRAM and it sits comfortably around 320-325W in the most demanding titles and never throttles.

P.S. The RGB and the fans are counted in the card's power budget! Turning off RGB and reducing fan speed, or running a full-cover waterblock that isn't connected to the card's fan hub or RGB connector like I do, helps quite a lot: 10-20W total.


----------



## vortex240

Imprezzion said:


> I've been looking for one since the card released, no. There is none. I tried every BIOS from every brand in existence with 2x8 pin, the best is still the latest Gigabyte original one with ReBAR support (OC mode, not the quiet mode one). Undervolt it to pull at max 0.956v @ 1080p or 0.906v at 1440p / 4K (more power draw at higher render resolutions) and you can run anything including ray tracing without power limiting and with max VRAM OC.
> 
> I run mine at 1965 @ 0.906v with +1400 VRAM and it sits comfortably around 320-325w in the most demanding titles and doesn't ever throttle.
> 
> P.S. the RGB and the fans are counted in the cards power budget! Turning off RGB and reducing fan speed or running a fullcover waterblock that isn't connected to the cards fanhub or RGB connector like I do helps quite a lot. 10-20w total.


Thanks, that's some great info! I am running the latest 94.02.71.C0.29; I think the build date is November 2021.

I'm also using liquid metal on it with a few extra thermal pads and a backplate heatsink. My memory can only do +700; anything more will just crash or result in lower performance.

I'm at 1935 @ 0.9V; with RT on, this will max the power limit in games like Metro Exodus or Resident Evil 3 Remake (ray tracing on).

Seems we arrived at basically the same place; at least Gigabyte's QC looks to be consistent across their samples.


----------



## Imprezzion

vortex240 said:


> Thanks, that's some great info! I am running the latest 94.02.71.C0.29 - I think the build date is november 2021.
> 
> I'm also using liquid metal on it with few extra thermal pads and a backplate heatsink. My mem can only to 700+ anything more will just crash or result in lower perf.
> 
> I'm at 1935 @ 0.9v - with RT on, this will max power limit in games like Metro Ex or Resident Evil 3 Remake (Ray tracing on).
> 
> Seems we arrived at basically the same, at least Gigabyte QC looks to be consistent in their samples.


In Metro Enhanced, Control, Far Cry 6 and Cyberpunk at 1080p with RT maxed it's not limiting for me, but as I said mine does not have the fans or RGB connected and runs at around 48°C under water, so less draw. It does seem most Gigabyte cards run about the same clocks, yeah: 1935-1965 at 0.9V. Just a shame yours has such bad VRAM; +700 is quite low. Mine starts to lose performance due to ECC at +1500, so I keep it at +1400 at around 70°C junction, and it's been fine for 6 months now.


----------



## BenchAndGames

Guys, I have a question.

Is it dangerous for the card to run the VRAM at 100°C while playing? I'm not talking about 24/7 mining at this temp, just while playing games.


----------



## opheen

BenchAndGames said:


> Guys have a question
> 
> It is dangerous for the card to run VRAM temperature at 100ºC while playing ? Not talking about 24/7 mining at this temp, I mean just while playing games.


I think the temperature is too high, especially on a 3080, which has the memory modules on the same side as the cooler. Whether it is dangerous or not, I'd be careful in answering, but it is worryingly high.


----------



## Imprezzion

BenchAndGames said:


> Guys have a question
> 
> It is dangerous for the card to run VRAM temperature at 100ºC while playing ? Not talking about 24/7 mining at this temp, I mean just while playing games.


Up to 90c would be acceptable to me but 100c is too much.


----------



## BenchAndGames

Imprezzion said:


> Up to 90c would be acceptable to me but 100c is too much.


Thanks, but I'm not asking about personally acceptable temps. I'm asking if it can be damaged if the VRAM Tj sometimes peaks at 100°C.
I see on the official Micron site that they say normal operation is between 95°C and 105°C, but they state Tc, not Tj, and we all know that Tj is quite a bit higher.

Has anyone had theirs get damaged?


----------



## Imprezzion

BenchAndGames said:


> Thanks but im not asking about personal acceptable temps. Im asking if it can be damaged if sometimes can peak 100ºC VRAM Tj.
> I see on the official micron site, that they said normal operatrion will be between 95º to 105ºC but they state Tc not Tj and we all know that TJ is quite higher.
> 
> Did anyone got it damaged ?


I haven't run mine long enough at those temps to damage it before I waterblocked it. I did have a few mates mining with similar cards, and they ran fine for months at 90-100°C, so I don't think it'll damage it. It will probably not OC as well, though.


----------



## chinyonghui

stahlhart said:


> HWiNFO64 pretty much has you completely covered, should be able to read any and all sensors in your build.


Apologies for the late reply. Ok thanks for your help!


----------



## chinyonghui

Imprezzion said:


> In metro enhanced, control, far cry 6 and cyberpunk on 1080p with RT max it's not limiting for me but as I said mine does not have the fans or RGB connected and is running at around 48c under water so less draw. It does seem most Gigabyte cards run about the same clocks yeah. 1935-1965 at 0.9. Just a shame yours has such bad VRAM. +700 is quite low. Mine starts to lose performance due to ECC at +1500 so I keep it at +1400 around 70c junction and it's been fine for 6 months now.


Hi, 

I am new to benchmarking in Metro Enhanced and am unsure what to look out for. Does it also produce a report card after it runs some scenes, like 3DMark does?

Any help is appreciated!


Thanks in advance!


----------



## chinyonghui

Hi all, 

I am new to overclocking my GPU in MSI Afterburner and would like to check if I should be adjusting my Core clock or Memory clock first? Or if I should adjust both at the same time?

Any help is appreciated!


Thanks in advance!


----------



## ogmadvlad

chinyonghui said:


> Hi all,
> 
> I am new to overclocking my GPU in MSI Afterburner and would like to check if I should be adjusting my Core clock or Memory clock first? Or if I should adjust both at the same time?
> 
> Any help is appreciated!
> 
> 
> Thanks in advance!


With a 3080 you can start with the basics of +100 core and +800 memory, run a benchmark, and keep track of the score, temps, actual core clocks during the run, etc. Then adjust your settings, run the benchmark again, and see if you get an improvement. If you OC the RAM too high, you will start getting lower performance once it starts error correcting; if your core is too high, you will get a crash.
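The step-and-benchmark loop described above can be sketched in code. This is only an illustration of the logic: `run_benchmark` is a stand-in you'd replace with a real benchmark run, and the score curve below is invented:

```python
# Sketch of the "step, benchmark, compare" loop described above.
def find_best_offset(run_benchmark, start=800, step=100, stop=2000):
    """Raise the memory offset until the score stops improving
    (GDDR6X error correction makes scores drop before it crashes)."""
    best_offset, best_score = start, run_benchmark(start)
    for offset in range(start + step, stop + step, step):
        score = run_benchmark(offset)
        if score <= best_score:      # regression: back off and stop
            break
        best_offset, best_score = offset, score
    return best_offset, best_score

# Fake score curve: improves up to +1400, then error correction hurts it.
fake = lambda off: 17000 + off // 10 - max(0, (off - 1400) // 2)
print(find_best_offset(fake))  # settles on +1400
```

The same back-off-on-regression idea applies to the core offset, except there the failure mode is a crash rather than a quietly lower score.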


----------



## ogmadvlad

Interesting thing I found playing with different BIOSes on my MSI Gaming Z Trio 3080 12GB. Using the Suprim BIOS I am limited to 400 watts, and in Port Royal I got my highest score with the MSI BIOS: 13315, averaging 2099 MHz. With the ASUS Strix BIOS I was able to push 440 watts peak and averaged 2130 MHz, but got a lower score: 13284. Same thing in Time Spy: MSI BIOS, 20490 GPU score at 400W peak and a 2089 MHz average clock, and I'm barely breaking a 20000 GPU score with the 440W power limit.


----------



## Alexandru Lascu

Anybody played with the Zotac Trinity 12GB? I might flash the Strix/FTW3 BIOS later, but those are 3x8-pin and the Zotac only 2x8, so I don't think the BIOS will spread the power correctly. On top of that, 75W PCIe + 2 × 150W = 375W, while those BIOSes are 400W/450W.


----------



## mouacyk

ogmadvlad said:


> Interesting thing I found playing with different bioses on my my msi gaming z trio 3080 12gb. using the suprim bios I am limited to 400 watts and on Port royal I got my highest score of MSI bios 13315 averaging 2099 mhz. With the Asus strix bios I was able to push 440 watts peak and averaged 2130 mhz but got a lower score. strix bios 13284. Same thing with Timespy Msi bios 20490 gpu score at 400w peak 2089 average clock. and Im barely breaking 20000 gpu score with the 440w power limit.


Effective clock was likely throttled by increased temperature. -1 bin starting at 40c


----------



## Imprezzion

mouacyk said:


> Effective clock was likely throttled by increased temperature. -1 bin starting at 40c


This. Even though my "core clock" doesn't change from 1950, the effective clock sticks to around 1950 up to 40°C, then drops to around 1935, since I can't keep it under 40°C with the CPU also dumping heat into the loop. It usually runs around 45-48°C in games.
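The binning behaviour being described can be modelled roughly: Ampere's boost drops the effective clock by one ~15 MHz bin per temperature step. A toy model (the 40°C threshold and 5°C step size are assumptions based on the posts above; the real tables are per-card):

```python
BIN_MHZ = 15          # one boost bin on Ampere
STEP_C = 5            # assumed degrees C per additional bin dropped
THRESHOLD_C = 40      # assumed first throttle point, per the posts above

def thermal_clock(base_mhz: int, temp_c: int) -> int:
    """Approximate effective clock after temperature binning."""
    if temp_c < THRESHOLD_C:
        return base_mhz
    bins_dropped = 1 + (temp_c - THRESHOLD_C) // STEP_C
    return base_mhz - bins_dropped * BIN_MHZ

print(thermal_clock(1950, 38))  # below threshold: full clock
print(thermal_clock(1950, 47))  # a couple of bins down
```

This is why two BIOSes can show a higher average "requested" clock yet a lower score: the extra watts raise the temperature, and the effective clock quietly loses a bin or two.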


----------



## BenchAndGames

ogmadvlad said:


> Interesting thing I found playing with different bioses on my my msi gaming z trio 3080 12gb. using the suprim bios I am limited to 400 watts and on Port royal I got my highest score of MSI bios 13315 averaging 2099 mhz. With the Asus strix bios I was able to push 440 watts peak and averaged 2130 mhz but got a lower score. strix bios 13284. Same thing with Timespy Msi bios 20490 gpu score at 400w peak 2089 average clock. and Im barely breaking 20000 gpu score with the 440w power limit.


Damn, that's very nice. I have the same card as you, and out of the box in Port Royal I scored 12628 with 390 watts on the original BIOS.


----------



## trivium nate

.... NVM


----------



## chinyonghui

Hi all,

I have just overclocked my ROG RTX 3080 to Core clock (+150) Memory clock (+1500) on MSI Afterburner.

I am now looking to undervolt it but I have never done it before.

I have also watched youtube videos but still don't quite get it.

Is anyone able to give me advice or guide me on this?

Any help is much appreciated!


Thanks in advance!


----------



## opheen

chinyonghui said:


> Hi all,
> 
> I have just overclocked my ROG RTX 3080 to Core clock (+150) Memory clock (+1500) on MSI Afterburner.
> 
> I am now looking to undervolt it but I have never done it before.
> 
> I have also watched youtube videos but still don't quite get it.
> 
> Is anyone able to give me advise or guide me on this?
> 
> Any help is much appreciated!
> 
> 
> Thanks in advance!


Look at page 255, I have a guide there.


----------



## chinyonghui

opheen said:


> Look att side 255 i have a guide


Alright I'll go take a look

Thanks!


----------



## mxthunder

Just got done trying to swap out the thermal pads on my 3080 FE. What an absolute nightmare. Had to take the card apart 4 times.

Got everything back together with 2mm pads on the front side and 2mm stuff on the back; core temps skyrocketed within a minute or so of Heaven. Took it apart, inspected it, thought maybe I had used too much thermal paste, so I used less. Temps were worse.

Did some more research and found out that some of the 3080 FEs require 1.5mm on the memory and 2mm on the rest of the front, so I ordered $30 more worth of Gelid stuff. Using the 1.5mm pads on the memory, I am not sure they are making good contact. I did a dry fit and pulled the PCB back off: there were only slight impressions on one bank of memory, and on the other side I took a tiny scrap of pad, placed it over the top of the big pad to see if it would get smashed, and it barely did. I feel like it's not making good contact with the memory.

Put the card back in and ran Heaven: core temps are good now, but memory is still 96+°C, whereas it was 104°C beforehand. Feels like it was a complete waste of time.


----------



## Dreamliner

mxthunder said:


> core temps are good now, but memory is still 96+ *C wheras it was 104*C beforehand. *Feel like it was a complete waste of time*.


As are most of the forum DIYs. It's just a bunch of obsessive hardware hobbyists looking for things to do. I'm not saying the cards are perfect, but I'm sure manufacturers try to limit RMAs.


----------



## MikeS3000

Not that it's the end of the world, but has anyone else had their RGB fail on a gigabyte gaming OC card? I've removed it from the motherboard and reinserted and it is blacked out.


----------



## Older gamer

mxthunder said:


> Just got done trying to swap out the thermal pads on my 3080 FE. What an absolute nightmare.
> Had to take the card apart 4 times. Got everything back together with 2mm on the front side, and 2mm stuff on the back, core temps skyrocketed within a minute or so of heaven. Took apart, inspected, thought maybe I had too much thermal paste so i used less. Temps were worse. Did some more research and found out that some of the 3080 FEs require 1.5mm on the memory, and 2mm on the rest of the front. Ordered $30 more worth of Gelid stuff. Using the 1.5mm pads on the memory, I am not sure they are making good contact. I did a dry fit and pulled the PCB back off - there was only slight impressions on one bank of memory, and the other side - I took a tiny tiny scrap of pad and placed over top of the big pad to see if it would get smashed - and it barely did. I feel like its not making good contact with the memory. Put the card back in and ran heaven - core temps are good now, but memory is still 96+ *C wheras it was 104*C beforehand. Feel like it was a complete waste of time.


Not sure if this is helpful or not...
This week I bought a shoddy, obviously ex-mining Gigabyte Gaming OC 3080 on Yahoo Auctions (for my spare rig). The seller claimed it had recent thermal paste; lol, baked. Anyway:
I cleaned the whole thing completely (so dirty). At first it ran at about 100°C, but recently it has run cooler (averaging 82°C in benchmarking tests). So, to get to your point: I wondered whether, as the thermal paste I applied got hot initially, it maybe got softer and (under screw pressure) spread more thinly (almost like warm honey?), increasing the contact area and helping plate-to-plate GPU cooling.

Note: maybe I should have, maybe I shouldn't, but when I was getting ready to put the card back together, I dabbed a tiny drop of thermal paste across the indented thermal pads with my forefinger, to try to partially bridge any gaps between the thermal pads and the GPU memory chips and such (that was my theory), and I wondered if that is why I got good results in FurMark at 1440p today.
My i7-4790K + RTX 3080 rig easily beat a score from a similar GPU running with a far newer CPU (i5-12600K), which is ranked 165 places ahead of my CPU in the UserBenchmark listings (I only mention it because the old i7-4790K is supposed to be around a 12-20% bottleneck in gaming with an RTX 3080).
(My i7-4790K + RTX 3080 FurMark scores vs. his i5-12600K + RTX 3080; FurMark benchmark scores database at gpuscore.top)
All I'm saying is: the temps could settle down. Give it a few days.


----------



## Older gamer

MikeS3000 said:


> Not that it's the end of the world, but has anyone else had their RGB fail on a gigabyte gaming OC card? I've removed it from the motherboard and reinserted and it is blacked out.


When I took apart the rough one I bought a few days ago, I noticed the 3 plugs were VERY tight when disconnecting them, and I had to work carefully to get the thing apart. I thought I could have damaged the plug mounts on the GPU board while working on it, but maybe I was lucky... Anyway, to make sure I could remove the plugs without excess force in the future, I put a small drop of PTFE lube on the plugs and contact pins so they would come apart easily later (if required).

That, or the narrow black LED cables could have been cut when putting it back together. Take it apart again and check?


----------



## mouacyk

3840x1620. Anyone else getting 70% usage in this scene? It's typically high 90's everywhere else:


----------



## Panchovix

Tried the new benchmark of 3DMark, speedway

With the latest drivers (522.something) I get pretty close to the top spot for my CPU/GPU combo, and 10th among 3080s in general.










With the mythic 466.63 drivers I get 5214, but it's not validated, it seems, because the drivers are too old.










Does 3DMark update newest bench to be valid with older drivers?


----------



## mouacyk

Panchovix said:


> Tried the new benchmark of 3DMark, speedway
> 
> With latest drivers (522.something) I get pretty close to top 1 with my CPU/GPU combo, and 10th 3080 in general
> 
> View attachment 2576850
> 
> 
> With the mythic 466.63 drivers, I get 5214, but not validated it seems because the drivers are too old
> 
> View attachment 2576851
> 
> 
> Does 3DMark update newest bench to be valid with older drivers?


What kind of power are you pulling with that result? I've barely managed to scratch 5K with 430W+ at 2100MHz.
Also, you should share your results here: 3DMark Speed Way


----------



## Panchovix

mouacyk said:


> What kind of power are you pulling with that result? I've barely managed to scratch 5K with 430W+ at 2100MHz.
> Also, you should share your results here: 3DMark Speed Way


I was pulling between 440-460W lol; the core clock was at 2160 MHz (it drops rapidly to 2145-2130 MHz because of the stock TUF cooler at 55-60°C), mem OC +1500 MHz.
And thanks for the link! Will post there


----------



## TK421

Any unlocked 3080 10GB vBIOS yet? The highest one is the Strix at 450W, while the MSI Suprim is pre-shunted from the factory?


----------



## chinyonghui

Hi all,

I have already overclocked my ROG RTX 3080 to Core clock (+150) Memory clock (+1500) on MSI Afterburner.

I am currently using a 240Hz monitor and I just found out that even on stock settings, my (Overwatch 2) in-game frame rate is already hitting max 240 fps.

Based on my understanding, the fps should not be more than the monitor's refresh rate.

I have a few questions that I would like to clarify:


1. Does overclocking only help to raise fps? If so, is there still a point in overclocking for me?
2. Are there other benefits to overclocking that I am unaware of?
3. Should I just do an undervolt only?
4. Should I be limiting my in-game fps? (It is currently limited to 240 fps.)
Any help is much appreciated!

Thanks in advance!


----------



## Imprezzion

chinyonghui said:


> Hi all,
> 
> I have already overlocked my ROG RTX 3080 to Core clock (+150) Memory clock (+1500) on MSI Afterburner.
> 
> I am currently using a 240Hz monitor and I just found out that even on stock settings, my (Overwatch 2) in-game frame rate is already hitting max 240 fps.
> 
> Based on my understanding, the fps should not be more than the monitor's refresh rate.
> 
> I have a few questions that I would like to clarify:
> 
> 
> Does overclocking only help to raise fps? If so, is there still a point in overclocking for me?
> Are there other benefits to overclocking that I am unaware?
> Should I just do an undervolt only?
> Should I be limiting my in-game fps? (It is currently limited to 240Hz)
> Any help is much appreciated!
> 
> Thanks in advance!


1. and 2. It helps with minimum fps as well, of course, not just average; higher minimums are just as important.
3. If you want to save on power and heat, of course. Try to find the balance between clocks and voltage that is most efficient for power and heat.
4. I never limit my FPS. My monitor is also 240Hz with G-Sync always enabled. It won't sync above 240Hz, but I always just let it run whatever it wants. I don't have any tearing or weirdness in games like World of Tanks that run at like 360 fps.

I ran into the kind of nice issue that my previous undervolt no longer stays under my power limit, thanks to my new 5800X3D creating a much higher GPU load because the CPU is so fast. I used to run 1995 @ 0.981V with memory at +1200, but that throttles hard now in games like Division 2 due to the much higher load and FPS. I made a new UV curve, and it seems my card really doesn't need a lot of voltage to run quite high core clocks. I have been testing several different games all evening and it seems perfectly happy at 1935 @ 0.887V / 1950 @ 0.893V, staying just under the power limit at around 325-340W with no throttling. It has a full-cover block with Prolimatech PK-3 paste but terrible stock Bykski thermal pads on the VRAM, so temperatures are around 55°C core, 67°C hotspot and 82°C VRAM. I have a sheet of Gelid Extreme pads lying around, but I really can't be bothered to strip down the entire loop again just to replace the pads. The VRAM is totally happy running +1200 at 82°C, so yeah.
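For anyone following along, a curve undervolt boils down to pinning your target clock at a chosen voltage and flattening every point above it. A sketch of that shape (the curve points below are invented, not from any real card):

```python
# Flatten a voltage/frequency curve above a target voltage, the way a
# typical Afterburner undervolt is built. Curve points are made up.
curve = {0.850: 1800, 0.900: 1905, 0.950: 1995, 1.000: 2055, 1.050: 2100}

def undervolt(curve: dict, target_v: float, target_mhz: int) -> dict:
    """Pin `target_mhz` at `target_v` and cap every higher-voltage point."""
    out = {}
    for v, mhz in sorted(curve.items()):
        if v < target_v:
            out[v] = min(mhz, target_mhz)   # keep the curve monotonic
        else:
            out[v] = target_mhz             # flat line at and above the target
    return out

print(undervolt(curve, 0.900, 1950))
```

The flat section is what stops the boost algorithm from wandering up into higher-voltage, higher-power points, which is exactly the power-limit behaviour being tuned in this thread.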


----------



## chinyonghui

Imprezzion said:


> 1. and 2. It helps with minimum fps as well of course not just average. Higher minimums are just as important.
> 3. If you wanna save on power and heat, of course. Try to find a balance between clocks and voltage that is most efficient for power and heat.
> 4. I never limit my FPS. My monitor is also 240Hz with g-sync always enabled. It won't sync above 240Hz but I always just let it run whatever it wants. Don't have any tearing or weirdness in games like World of Tanks that runs at like 360 fps.
> 
> I ran into the kinda nice issue that my previous undervolt no longer stays under my power limit thanks to my new 5800X3D creating a way higher GPU load because the CPU is so fast. I used to run 1995 @ 0.981v with memory at +1200 but that throttles hard now in games like Division 2 due to the way higher load and FPS. I made a new UV Curve and it seems my card really doesn't need a lot of voltage to run quite high core clocks. I have been testing several different games all evening and it seems to be perfectly happy at 1935 @ 0.887 / 1950 @ 0.893v and it stays _just _under the power limit running around 325-340w now and no throttling. It has a full cover block on it with Prolimatech PK-3 paste but terrible stock Bykski thermal pads on the VRAM so temperatures are around 55c core 67c hotspot and 82c VRAM. I have a sheet of Gelid Extreme pads laying around but I really can't be bothered to strip down the entire loop again just to replace the pads. The VRAM is totally happy running +1200 at 82c so yeah..


Hmm ok, thanks for sharing!

I have just done OC and UV on my GPU using OW2 as stress test and I have achieved 2130MHz @ 1081mV according to afterburner without any artifacting or game crashing.

I am only using HWinfo to monitor my stats and my max hotspot temp: 95.8°, max memory junction temp: 88° during gaming.

My core voltage and power limit are maxed out during OC and have not been changed since.

I have a few questions to clarify regarding UV and temps:

1. During UV, I noticed that my max hotspot temp and max memory junction temp keep getting hotter. Is this normal?
2. Is it possible to bring down the temps further by lowering the core voltage?
Thanks in advance!


----------



## fray_bentos

chinyonghui said:


> Hmm ok, thanks for sharing!
> 
> I have just done OC and UV on my GPU using OW2 as stress test and I have achieved 2130MHz @ 1081mV according to afterburner without any artifacting or game crashing.
> 
> I am only using HWinfo to monitor my stats and my max hotspot temp: 95.8°, max memory junction temp: 88° during gaming.
> 
> My core voltage and power limit are maxed out during OC and have not been changed since.
> 
> I have a few questions to clarify regarding UV and temps:
> 
> During UV, I noticed that my max hotspot temp and max memory junction temp keeps getting hotter. Is this normal?
> Is it possible to bring down the temps further by lowering the core voltage?
> Thanks in advance!


2. Yes, but that is not an undervolt; that's an overvolt and OC. A stock 3080 will not run at voltages as high as you show.


----------



## BIaze

Just recently bought a 3080 Strix 12GB model

I'm not sure if this is how it works, but increasing the core in MSI AB always REDUCES power draw in the Time Spy / Time Spy Extreme stress tests, whereas my previous 30-series cards didn't do this.

A +100 or greater increase just drops the power draw from 420-440W to 380-400W with perfcap reason PWR, even though this card has a 450W power limit.

Is this an issue on 3DMark's end, or was it caused by a recent driver change that I'm unaware of?






video of the problem


----------



## BIaze

carneb said:


> That was me who posted on the EVGA forum. The Suprim X bios increases the power but there is still a limit somewhere. Under lighter loads the card will run at 1.1V all the time but with heavy loads (benchmarks, not furmark) it will still drop the voltage due to power limiting.
> 
> It will let the card draw around 510W in 3DMark and let me increase my Port Royal score to 13851.😁


was this on 3080 10GB or 12GB model?


----------



## Imprezzion

Man, I had to drop my undervolt way, way down with this new CPU. On my old 11900KF I could run 1980 @ 0.925V all day in any game, but with the 5800X3D the load is so much higher that it hits power limits easily and wasn't stable anymore. Dropping to 1950 @ 0.906V wasn't stable either. I had to drop all the way down to 1905 @ 0.900V to stay stable with +1000 memory and not power throttle lol. Some lighter games like Halo MCC or BDO run at like 240W now, but games like Division 2, Cyberpunk with Psycho RT, or Forza Horizon 5 with RT Ultra still hit 330-335W even now. If you have the CPU to actually feed the card optimally, it makes a huge difference.

On the plus side, it runs very cool now; with such a low voltage, even at 95-100% power draw it barely touches 48°C core / 55°C hotspot with the full-cover block.


----------



## noa25ro

Hi people! I need some help. I have a 3080 Eagle OC, which has a max power draw of 340W. I tried to flash a Gaming OC BIOS with a 370W max power draw using nvflash64, but it does not work. I can't even make a backup with nvflash64 (the command used was nvflash64 -b xxx.rom). After the screen flickers a few times, I get an error saying I need to restart the system before I can do anything. I tried to set protectoff and got the same message. It seems like the GPU is somehow protected.


----------



## Imprezzion

noa25ro said:


> Hi people ! I need some help. I have a 3080 eagle oc with a 340w witch has a max power draw of 340. I tried to flash a Gaming Oc bios which has a 370w max power draw using Nvflash64 but it dose not work . I can't even make a backup with nvflash64 (command used was nvflash64 -b xxx.rom). After the screen flickers a few times i get the error that i need to restart the system to be able to do anything. Tried to set protectoff and i get the same message. It seems like the GPU is somehow protected.


Don't bother. The BIOS is internally the same: the Gaming OC / Vision / Eagle BIOSes all limit at 340-345W (I have one, so I know). The only BIOS that allows a higher power limit (366W) is the EVGA XC3 Ultra, and it does draw more, but it has the nasty side effect of glitching the VRAM so that only 8GB is used instead of 10.

I could flash my Gaming OC to a ReBAR-supported stock BIOS just fine with the latest nvflash using nvflash -6 nameofbios.rom, even without --protectoff. Make sure to use the -6 flag.


----------



## noa25ro

Imprezzion said:


> Don't bother. The BIOS is internally the same; the Gaming OC / Vision / Eagle BIOSes all limit at 340-345W (I have one, so I know). The only BIOS that allows a higher power limit (366W) is the EVGA XC3 Ultra, and while it does draw more, it has the nasty side effect of glitching the VRAM so only 8GB is used instead of 10.
> 
> I could flash my Gaming OC to a ReBAR-supported stock BIOS just fine with the latest nvflash using nvflash -6 nameofbios.rom, even without --protectoff. Make sure to use the -6 flag.


So what you're saying is that I can't do anything to get a bit more oomph out of it? I forgot to mention that I OC'ed the card as-is and it runs at 1950-2000 with spikes to 2040.


----------



## Panchovix

noa25ro said:


> So what you're saying is that I can't do anything to get a bit more oomph out of it? I forgot to mention that I OC'ed the card as-is and it runs at 1950-2000 with spikes to 2040.


Your only way is to shunt mod, that's what I did with my TUF 3080. Not recommended though.
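
For anyone curious what a shunt mod actually does electrically, here's a toy sketch (the 5 mΩ sense-shunt value and the stacked resistor are assumptions, not measurements from any particular card): soldering a resistor in parallel with the current-sense shunt lowers the resistance the power controller sees, so it under-reports current, and the card can pull far more real power before it thinks it has hit the limit.

```python
def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

stock_shunt = 0.005   # ohms; 5 mOhm current-sense shunt (assumed typical value)
stacked     = 0.005   # ohms; resistor soldered on top of the stock shunt

effective = parallel(stock_shunt, stacked)   # 2.5 mOhm
scale = effective / stock_shunt              # 0.5 -> sensed current is halved
power_limit = 320                            # W, the limit the card enforces
real_draw_at_limit = power_limit / scale     # actual draw when the card reads "320 W"

print(f"sensed power is {scale:.0%} of real draw; "
      f"the {power_limit}W limit now allows ~{real_draw_at_limit:.0f}W")
```

This is also why it's not recommended: the VRM and power connectors see the real 600+ watts, not the number the driver reports.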


----------



## Imprezzion

noa25ro said:


> So what you're saying is that I can't do anything to get a bit more oomph out of it? I forgot to mention that I OC'ed the card as-is and it runs at 1950-2000 with spikes to 2040.


That is correct. A heavy memory OC or a shunt mod is the only way. Or lower temps significantly: lower temps mean less heat loss and better VRM efficiency, so the card can pick up a few watts. Also, going custom water and not running the stock fans or RGB frees up power budget as well, since the fans and RGB are included in the total board power draw and count towards the limit.

I run mine under a Bykski Gigabyte Gaming OC 3080/3090 block with the RGB externally powered and I run 1920 @ 0.900v with +1000 memory. It's by far the most efficiency I can get out of the card. Lighter games like BDO or Halo MCC run around 240w and heavy games like Division 2, Horizon 5 with RT Ultra or Cyberpunk RT Psycho around 320-335w so no throttling ever.


----------



## mouacyk

I have a Bykski block on my shunted Eagle OC and it does draw up to around 450W before throttling. With a Supernova radiator, I am able to clock to 2100MHz @ 1.1v for games and 2200MHz for benches, while staying under 45C. In games, the jump from [email protected] only gives 1-3fps.


----------



## chinyonghui

fray_bentos said:


> 2. Yes, but that is not an undervolt, that's an overvolt and OC. A stock 3080 will not run at such high voltages as you show.


Hmm ok, are you able to guide me in doing an undervolt?

I have read quite a few ways to do it but am still not too sure.

Do I use the VF curve and flatten the graph into a straight line, starting from the end of the graph?


----------



## fray_bentos

chinyonghui said:


> Hmm ok, are you able to guide me in doing an undervolt?
> 
> I have read quite a few ways to do it but am still not too sure.
> 
> Do I use the VF curve and flatten the graph into a straight line, starting from the end of the graph?


Yes


----------



## chinyonghui

fray_bentos said:


> Yes


Ok, what do I do first?


----------



## pf100

chinyonghui said:


> Ok, what do I do first?


----------



## chinyonghui

fray_bentos said:


> Yes


Apologies, I misunderstood your reply.

Actually, that was what I did, and the lowest I got is 2130MHz @ 1081mV stable after stress testing.


----------



## fray_bentos

chinyonghui said:


> Apologies, I misunderstood your reply.
> 
> Actually, that was what I did, and the lowest I got is 2130MHz @ 1081mV stable after stress testing.
> 
> View attachment 2584158
> 
> View attachment 2584159


You have completely misunderstood what undervolting is. The line should level off at ~825 mV for a decent undervolt and your frequency will need to be a lot lower.


----------



## chinyonghui

fray_bentos said:


> You have completely misunderstood what undervolting is. The line should level off at ~825 mV for a decent undervolt and your frequency will need to be a lot lower.


Hmm, doesn't that mean undoing my OC and losing any advantages of it as well?


----------



## pf100

chinyonghui said:


> Hmm, doesn't that mean undoing my OC and losing any advantages of it as well?


First, lower voltage equals less heat, period. You can go any way you want with it: you can go for a higher clock than stock, because the lower heat lets it boost higher, or you can go for a vastly cooler running card with lower clocks combined with lower voltage. This chart for the RTX 3080 with a 320 watt power limit should help illustrate the point, and keep in mind that the numbers will be a little different with your card depending on the silicon lottery.
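
To put rough numbers on the "lower voltage equals less heat" point, here's a first-order sketch using the classic CMOS dynamic power relation P ∝ f·V². It ignores leakage, memory, fans, and everything else on the board, and the clocks and voltages are illustrative picks rather than measured values, so treat it as a ballpark only:

```python
def rel_dynamic_power(f_mhz: float, v_mv: float, f0_mhz: float, v0_mv: float) -> float:
    """Core dynamic power relative to a baseline, assuming P ~ f * V^2."""
    return (f_mhz / f0_mhz) * (v_mv / v0_mv) ** 2

# A typical 3080 undervolt (1785 MHz @ 825 mV) vs. a stock-ish boost
# state (1950 MHz @ 1000 mV) -- both operating points are illustrative.
ratio = rel_dynamic_power(1785, 825, 1950, 1000)
print(f"undervolted core power is roughly {ratio:.0%} of the baseline")
```

Because voltage enters squared, dropping ~175 mV buys far more power headroom than the ~8% of clock speed it costs, which is exactly the trade-off the table illustrates.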


----------



## fray_bentos

pf100 said:


> First, lower voltage equals less heat, period. You can go any way you want with it: you can go for a higher clock than stock, because the lower heat lets it boost higher, or you can go for a vastly cooler running card with lower clocks combined with lower voltage. This chart for the RTX 3080 with a 320 watt power limit should help illustrate the point, and keep in mind that the numbers will be a little different with your card depending on the silicon lottery.


Hah that's my table


----------



## pf100

fray_bentos said:


> Hah that's my table


This is great news and I'm glad to have run into you. It must have been a tremendous amount of work. I saved this table during about a 2 or 3 day thing where I was trying to get my 3080 dialed in and your table was the single most helpful thing I found as far as seeing how voltages and frequencies scale with the 3080. Also, the numbers just happened to work with my particular card perfectly after I had wasted a lot of time figuring out exactly how to do undervolting on the 3080 and failed a lot the first day. About a day or two after I saved the table I wanted to contact you and say thanks and to ask if you had a larger size copy but I couldn't find that reddit thread again. I blew it up to around 125% size in gimp for better readability. So, do you have a bigger copy? And how long did it take to make the table? And thanks for making my undervolting journey a lot easier.


----------



## fray_bentos

pf100 said:


> This is great news and I'm glad to have run into you. It must have been a tremendous amount of work. I saved this table during about a 2 or 3 day thing where I was trying to get my 3080 dialed in and your table was the single most helpful thing I found as far as seeing how voltages and frequencies scale with the 3080. Also, the numbers just happened to work with my particular card perfectly after I had wasted a lot of time figuring out exactly how to do undervolting on the 3080 and failed a lot the first day. About a day or two after I saved the table I wanted to contact you and say thanks and to ask if you had a larger size copy but I couldn't find that reddit thread again. I blew it up to around 125% size in gimp for better readability. So, do you have a bigger copy? And how long did it take to make the table? And thanks for making my undervolting journey a lot easier.


I've got the Excel file somewhere, but I'm on my phone right now. I just set different voltage values over 18+ months of gaming and the table built itself slowly.

Glad it was helpful to someone!


----------



## F7LTHY

fray_bentos said:


> Hah that's my table


Man, I just saw this and let out a big "huh", lol.
I've had my 3080 for about a month. It's a blower style, +0% / 320W PL, and right off the bat I went to undervolting it. After about 3 days of tinkering and testing I settled on 825mV @ 1770MHz and called it my sweet spot. Seeing your table makes me smile, happy that I made a good choice, so thank you for your effort; it's nice to see.


----------



## chibi

fray_bentos said:


> Hah that's my table


Hey Fray, for that table, are you just doing the alt + drag the highest point to 1785 freq, then set the 831mv point to flatline at 1785 and apply?


----------



## fray_bentos

chibi said:


> Hey Fray, for that table, are you just doing the alt + drag the highest point to 1785 freq, then set the 831mv point to flatline at 1785 and apply?


I reset to default, then open the VF curve, then shift the clock slider all the way to left (to shift the curve down). Then I lift the single voltage point I am interested in capping at up to the intended frequency. Then apply and save to a profile. I've never used alt. Does that help?
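
For anyone following along, the recipe above can be sketched in code. This is a toy model with made-up curve points, not Afterburner's actual editor, but it shows the shape of the result: everything below your chosen voltage shifts down, and everything at or above it flatlines at the target clock.

```python
# Toy VF curve as (mV, MHz) points -- a real curve has many more points.
CURVE = [(700, 1500), (750, 1590), (800, 1680), (825, 1725), (900, 1860), (1000, 1980)]

def flatten(curve, cap_mv, target_mhz, offset_mhz):
    """Shift the whole curve down, then lift the chosen voltage point so the
    GPU never boosts past target_mhz or requests more than cap_mv."""
    out = []
    for mv, mhz in curve:
        f = mhz - offset_mhz        # "clock slider all the way to the left"
        if mv >= cap_mv:
            f = target_mhz          # lift the cap point; points above flatline
        out.append((mv, min(f, target_mhz)))
    return out

uv = flatten(CURVE, cap_mv=825, target_mhz=1785, offset_mhz=200)
print(uv)   # every point at or above 825 mV now sits at 1785 MHz
```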


----------



## chibi

fray_bentos said:


> I reset to default, then open the VF curve, then shift the clock slider all the way to left (to shift the curve down). Then I lift the single voltage point I am interested in capping at up to the intended frequency. Then apply and save to a profile. I've never used alt. Does that help?


Thank you, that worked. I was able to get my core stabilized at 831mV / 1780MHz. Will play around with it a bit more and see if I can optimize further. Temps and power consumption are great!

PS: what are you guys able to get for memory? So far I have left it at default. Do I just go up in 200MHz intervals while keeping an eye on benchmark scores until they drop due to the ECC?


----------



## fray_bentos

chibi said:


> Thank you, that worked. I was able to get my core stabilized at 831mV / 1780MHz. Will play around with it a bit more and see if I can optimize further. Temps and power consumption are great!
> 
> PS: what are you guys able to get for memory? So far I have left it at default. Do I just go up in 200MHz intervals while keeping an eye on benchmark scores until they drop due to the ECC?


I've left my memory stock as increasing the memory clock seems to increase power consumption; it didn't seem like an energy efficient way of increasing performance (which defeats the purpose of undervolting). Similarly, I was put off by early reports of 3080Ti dying from memory failure.
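
For a sense of what a memory OC buys on paper, the theoretical bandwidth math is simple. The stock numbers below come from the 3080 spec (19 Gbps GDDR6X on a 320-bit bus); the assumption that a "+1000" Afterburner offset works out to roughly +2 Gbps of effective data rate is the convention commonly quoted for GDDR6X, so treat the OC figure as approximate:

```python
def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Theoretical memory bandwidth in GB/s: per-pin data rate times bus width."""
    return data_rate_gbps * bus_width_bits / 8

stock = bandwidth_gb_s(19.0, 320)   # RTX 3080 stock
oc    = bandwidth_gb_s(21.0, 320)   # roughly a "+1000" Afterburner offset
print(f"stock: {stock:.0f} GB/s, overclocked: {oc:.0f} GB/s (+{oc / stock - 1:.0%})")
```

That ~10% bandwidth bump is why a memory OC helps in bandwidth-bound games, but as noted above it also costs power, and GDDR6X error correction can silently eat the gains past a point.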


----------



## Dziarson

First day of fighting with it: I scored 17,060 in Time Spy. I like this card so much. I changed from an MSI 2070 Super Ventus OC to a 3080 Suprim X 10GB; at the moment it boosts to 2145 MHz at 422W of power in HWiNFO64.


----------



## mouacyk

Portal RTX power consumption and temps @3840x1440 DLSS 2.0 UltraPerf (1280x480) ~90fps, Perf is <60fps. RTX is quite the troll:


----------



## mouacyk

Dziarson said:


> First day of fighting with it: I scored 17,060 in Time Spy. I like this card so much. I changed from an MSI 2070 Super Ventus OC to a 3080 Suprim X 10GB; at the moment it boosts to 2145 MHz at 422W of power in HWiNFO64.


the score that matters is 19,973


----------



## Dziarson

I did not write this: "I scored 17,060 in Time Spy". I scored 12,897 in Port Royal and 8,266 in Time Spy Extreme.

This is auto ****
I need a better CPU





hwbot.org


----------



## mouacyk

Ouch, W3 Next-Gen hurts so much.


----------



## mouacyk

I NEED an RTX 4090...

3840x1440 DLSS 2.0 Performance Mode, All RT options on, Graphics Preset High, Getting 100% GPU usage


----------



## Madouce203

mouacyk said:


> I NEED an RTX 4090...
> 
> 3840x1440 DLSS 2.0 Performance Mode, All RT options on, Graphics Preset High, Getting 100% GPU usage
> 
> View attachment 2588704


The card actually draws 300 watts, not 442. It's probably a 3-pin BIOS on a 2-pin card.


----------



## BIaze

What's the best BIOS for a 3080 Strix 10G?

Is it the stock BIOS, or will flashing the Suprim X BIOS net a gain?


----------



## mouacyk

Madouce203 said:


> The card actually draws 300 watts, not 442. It's probably a 3-pin BIOS on a 2-pin card.


You're really late to the discussion. Best not be making assumptions.


----------



## chibi

Anyone know if I could slap a Strix 3080/90 waterblock onto the Strix 3080 Ti LC AIO?


----------



## BIaze

chibi said:


> Anyone know if I could slap a Strix 3080/90 waterblock onto the Strix 3080 Ti LC AIO?


You can; all the Asus 3080 / 3080 12GB / 3080 Ti / 3090 cards have the same PCB design.


----------

